[Binary archive content — not reproducible as text.]

This file is a POSIX tar (ustar) archive, not a text document. Recoverable header metadata:

- Directory: var/home/core/zuul-output/ (mode 0755, owner core:core)
- Directory: var/home/core/zuul-output/logs/ (mode 0755, owner core:core)
- File: var/home/core/zuul-output/logs/kubelet.log.gz (mode 0644, owner core:core) — a gzip-compressed copy of kubelet.log, a Kubernetes kubelet log captured as Zuul CI job output.

The remainder of the file is the raw gzip-compressed byte stream of kubelet.log and cannot be rendered as text; extract it with `tar -xf <archive>` followed by `gunzip` to read the log.
ϥBzc{Uґw>U o^:ۊsxmʤs ؆&px/'a;< (0P Dm]m6g&YPFpiH5  Gt\3kSGGaY81e@)cP I@}O5մĜS*"Zy*)R/{ ^J%qe\tej{ٻLdmٻo읦4;Xj'gO/uMIH]@pBROJ%UJȽgּBR³׉U;4MX.9G'&˙4i ?lT! >P1@#ƀfgB@:P8 Zj} aclWQ&oDVR9Pq_issRigjxYNSI*)&w(uIJ҆ ,DY# "F![myLgSd\MD| !zOP쓊kmk99F\n_*8E7Q"۔cAa%B@QMnoGOaN;7q%~f> 5,}F:e6c ޭ]H%w!/QC7.*7> 1KTdžYݪt'ڦ)h<36UYe-ߴ|s|s0Kr2*\oK8햶[[Zbs7SsM EpQ˗ 3[ltYZW'f)3}(\:aJsr+&Vҭi t͕ba IY$8އ5L&m\4j,L$LeRc~ϧm7X$<4mx`|[?{WƑ $@Z}1I$s/!*qL6[~Ou7WII,.ۀ[UgΩp߲A]{Je0?ySTJY*!z Mq3a5ģi'/TZPi9Ih͊g9P^ *1o&IW]Ċ%T2'FS+P(zD\hiDZ GFs]8{pD.5JŬƓY-G8^jeI;XR.HvjjӔcA{=/ڛʃ*o^&% bB dw 3 -7j*"8^SZΌNj$ʀJhF;&)To.[ux2? ];ۄ:տ-vV_ǻAe?ieaiE_V>_7E;KMȷSΞ`=ԝdl+ 6VwKnGkXR_p2f?hڥl!P)- N`N$Ōzo.󍼝u7C4CoG_f֙aφ,YoLnZ@Q-9qW"V>&\3fH;\`e&R]Єj%tB~'Z/}tRAGwF9=DBhd!'Mj+-yt IOOaxvrդB<ċzU2o`/Y5Gt&qdm^RPdp rӘG"heFASQ`pjϝ$Lfζg>FֱcvWWs_ŦTSuOF5:agS˕z.5:5\S v5:zL׳WKHcoY}S5-(_߀<^Wԝ_ Pr^HH$` uᄱp GEOD% Ƴ@;fC$:E$)8jB4gU-ҡAňsru׽'crjʀm3o&՜UX;Ot*tOww蔈DDE""QHD?tOP'T[9CCPxYqxUt4 İRNq,z̝vX-"r38H}aFhNPeENH-N73l/sT qY'&(SҹVZa6[(礼"*&VP0m+@[*( VvH?Ʀ-O#^u|le¸gVv1b=-_Sb@?rRj͜'ő;F+ 2XHz q6nW|MC i%}\xUOk;l.3)*~tiл ÌJ`/9N=)!uf]`p Yt $h>(L+uz\8+HFǤ"9©= L%1#AyLXc"Sy\ehʏͬ!x+DD4,rj<< R9;mA)5QBuP!83pV# *$T`▉ќy%<ڌZsk$c\lp)בplG>UkA k8?Nwc:go_Z$K^ sub$oc}J7%f(o ߽N>߫.TeT-[EcoŘL23H)D0tv)3S 0Esm]  ʊN۔:rɪr0fw!}wQ{pҜ/#oRdFN]T%7ߚ*I|TP83b܃){4`0?+@JIVco陚{VW~.\5[g\/a/꽥z " ᯷ݕ6woB-3QZgb|LmӐiX4чᨒQ)a4ZFtd9kg%hӇlY57Lga#ibإoK_< F=90h uG=XX_7w!}.0QOoA|;0 Yy m|ܛw>bjkjo:5USs ȼ<#{}ԂJlgb J?~vKMT?i5sO`u[fjU]]|ܵD 4 ·^me4Sqé*fKpc(j$ 'S0fT`7`, *0wHsIMku֏My+nN Ӡ8@z$vh1T{)r8`Stg;2E?ċA)#bX#6$Jj >`g0.0o:-0X@;ruREē@МEf=ZrumPNVk0`oux+o5})o.}WNK6l &c Gc%"@ F/pJ0ûԀ5fТ D[ç\AQ yn U cJ#68R$#Z{1[,CPTxշR|%m FI026 ߇ mz \9Uwsxñ1%9#(VF֡Q::29 6T%wLzK0X0#Al# $QSW[-wa2cXb& KQ!d 2RE zcB)7j<[֑aYA[R.k%'RHDc:ٖX/\%oTš!@vũQ z >:y$ 7K9_i3rFBB,ZXJL!0"~ |[(OBf;?Y~%c Dٔ14B*ӁpD)QaLt ͫ=Lfy^uCc|p/Em"*DBE $ƃ<3H͹K1L x\t̛yBiNW- >nu/%g|79@!ıG/}Z.%vZΎS`q G]ݧzYVx~Sa31:KTڊVp! 
BNu.!Viqf2鋕700ԸXO|oh㽅$ϳdk'+&]OoUՊ7.OϠo7[ĥo&6'|W-kG[w2~k_4,Ȳo:9IٛK1uAw 7-eg*!pGX rAwK(N^VtK!26eea[iA lU\I{* B /lPqx.+}pDz1454 B)44):0W&!bn"#pL$~著@BҨ]r-ҍ\앶~uhTPFec:_i3[̘nU:::Յ5Qa *ި?*t"ud1KI\f@I<CTbrH8+CBʹ-Cښ IUp;oL~11D"+j$+cV{ۡXN:@PBCK,,3]/Jf4|@87Dmea/a\0Ky/aN -#[XcI"|!t^[% rtr{p?={oޑo't};kA5[cA"LJ)tDdw 3a68&,gaȗ7t>/ntsà a<n5Jlxٱxp_`B<= S띱Vc&ye4zl5ͭl6ΏooD4POH;GkJYT1$F0)^""vPC 50E.(qCvYIRroDBɵ,.ǕqT8U3NoN%8jT*cwehJT6$AKY\p^CQRţ&[El2"|S%JZ]JQKbCQP[6G9 ]`O"g&+)YIJaJbJ,T:I@%hD I$0hSQjVҁ}Xľh)_My3zYpll޾rȵu3u͕@hszrdO/=E[7]낕+r6@8ݮ=6.Q}R07rwsRoV^zlXlbϛ|SMy9mJq7?*95Ǔ5yu1[:\J^r~gQ+-sW\(-ep,sѣo5;pb}3_uf6d.sѶ=&wm;/n׹zqkuL>OݥoZsm*|ux8k{DjmK椇ػ]^GQ/{3?^_=3TAAAAj0%xks̬Mg+<\.56T@01 ,:V85Cp>8"K*8 m  :|v03#[+̏'Ş@S+e x+g^RG>!fDA"WB /[ˌ\h&6F NpgՎs%Saeeh ({.FΖLy_y4i\Zq'_U.>kbAvNE4(pŃ@,WsSGЌ (K3 Nj- sIp%^ \Or{ u~l̢I2xdd T4SК oTDD%YHQϕǖq0v~|Voa݆0 4ז@VRE.ruQF5WP/?4Ҵ ! )-Tn=rUW{|2Df0F8HU S{wL-Ŋ}6[ =w8>Q8&9|`Qɕ/fWCN{J(%!Zzѓ1麇rTF֏:UzVNr e$,}>We)=Q(xǰSTk:uv{u컷߾.~wg|2}oo޿W? LU$}6 <  oMq\C Vg͸#%5h~fO|}3/FrvDAWV (]NқRTq)nW>ZC<  ql6]4rQ-éfRH~;mY#s0fa~*0!~VD<1"ʰt֏oHuuqćG-$K"pB\H9MNpm4ը{i\?7q?R;0l9`$jYR&YCRP>7іbd{QPEP8:ctēA)z٢'W=5u[oW:]=k~򭲻!}؄ C3_ R<,޹&$4F h qa`k)/$ɽ@.ad=)i{pIO#[_VH'H4+^Z3˰p޻>0`*HPYӆD ixJk3Z%;кd1$'Ť4# Jo[U@bƧ$KR%ã+cR &N &dHfEy:鉶6S-W-cg}s5~%&5rz5ۇmfo<6CwKEfFKV&emҞ-U1%R%9%Qq.MVUVjUuU)2YjYɲ4ehZUt9@x&S%4K`&6:po .\,LWrG&$U{eCsts{0ɢZ{-;RĔ oPPd[py|T'I57;,K9}jqoLxD6S4"T%$OVa5u"EN*MS޸$9GLδʈMH", qC 42&(řP v6Gb!N֗FXٌP'G_~\qcqv㰅G |=h|m>!^.Q \E4:I%e҄#RG,KQ**m@3XZ6 "F!zBӴOg&\U26nblxVfo 8c2Lr R+8c25@hZ> 3X { QmJ1pChB &^WW{V,N8{O`t~׋{,Z1{p1 j,7t&rnu"ngž=}_{KCշ.OMϠ[?_Mg]}[fL0[m˛f鷲/U`tY*7)7;sIN7U@ ~gomiÏab{kE [f]^AmfKed>"U0`O0YUBW bhy´]J,;y)Dg-asm& " %t\Y]fN;$-w^5cr;KXsb7&7i +wwmkɎio^11‹0[>Ck(-'K7͆ }m'3ڲpYVLЬf^Zl ۄwc՘tl'myy؛5͗ـm;eb(N*z*ZΒ2N/<۔S#CQq؝r~ry f"Kז{ Z(&)D2NOH DBoZtZ1x9hw36M=K!ozz]wigd?zO_ ]η6da8ŠDPJgm (0OXZskM1~̽-O?},{ԇ7kK/^hc"1Mu&ruPH`T<j[#Tbkga|&ud -B R[D\&4(C9qhb1%Or?I(4HG% E2>9~4 2O}r\k҉"6YDyވݴK8v-\y]OݼN7,Z~}#ȓ6B5 =YSkO欽‘ly(7,*KWBx 2}>EaC4\"Rǀ q^mjFU 
!9Gh2T=z*$Pb)U1H*TMJkbl֌Q^ta1x.u!p2܆u..?l|Nn&~v5h`04 /].f"VAs415O^3bʩHVڣ%Xʆ pg#9YV\*/I{DD02WtRln4 KhbT 6T]l UT;-(񊠍Gj D%L{^.$G{-zl>jT*cN7Ӕ !\m85I*ʃ ༆(G=M gC [V%J.%%lB!ſ۶Ъ6Q60؝G9g&+)YIq tgǴ̏%?ځϧ\FiB\YRc 6N-դzN~ia*@sP7iM 8% k]YoI+_ c{.=`1D"e,/odIHJ,^RlYY_#b&` ʹeZ4)\ІH1AxGU:ꎌEU30~a74(QSBTЛG f}k{X30y:!TăCx ;UAⓔei {2r^Rʻ9ۿ}YgޢN ^ZE.e~wKmիut{݊hh86;0&U=5H>yV*׋5֦[zƑ6*k xfzz{py u:]qY JXFji^%pֱ$'Lu/{~%g$ 4&!pkKGؔMq <u Aznl $ m} O5Zks 2%R ()WہH@J>>BisBL`Mɴ22r*L3ix2fW>q5'#ZJ屋L%o:/Dq%@pBNH\e9qTR T2ӈ($h.NI\!ɈL.Lu*Su#^R58q2NE\ejыLzJ %=!q~:L^w+q5ŕ]Hq6ks7寣Z?~L$/L*jͤy%)&2"$c2lJD4P)YL:@e>!,hC^"ow]Gw=]Rƽd}2f ˫F5yƗ˫vh㭗tdUB97E?^y*vaXb& /+XD5.|-,ݯ G%$4Q)8IIBK@uӤOd'N,i.NzpBcITI .&K)paQŵq3QE[{o,,G"g +*$xr $YO8 Z)S 7@ q4!R ښZDV[-!zPtugN' C-(E'7 冊&i_1}m QiCR0G jJ+fxF" 7 0KJ  LԚ&^ &Qkz=~cᱚ]3X o>t}\Iڠa9[v}/zpd_r"^ۓbtw 7)]ZG]aW-od;W~b ~=Qe9f`4zݙ;>693}nkN?u몋L l ߔis}9 #v2 Ia8hR{oKh0v HJA(O#|_(Em18(\8Nj"J(ʤ  Ps06r6 Lsuz癒/Pz7}_妜\+f&:27A;rFxhs{#eE Z B|^ ߹b. uReH%q5=q#rmyx#؏(IZ?Y%ߔapm}u Zy2R_(y |j-5';j4s0J('욜](9uCWgcɿZhwVÈl$ՙx/.'WQ)p ǝ9>]eb4tuC˶ԝwk[ *ʾ6\ TI |GN_65d0mtjuen,|FISknN[۳-u]fx9,8{HFSVLi_\֖qVy3,.mw}T<3|Oz ~ T?fv|DF +e`^Ŭ{dl_&7~dUO(O8@H~^o(R+{mFr  PJ .|!o?|L>-j *F޶" }#mz4>ohaG V sESxC3*q|hp[KSϛo}{sI2NZstL)pV|F6lZ~JC< T?h\Ev981VHhbQE̖TK$r, ' }ze 1ڊ#@'F5^soߑJeuqF-$K"pB\H9MNpm4QoD>%8F2f{bJ;0l\4Q˒6 "GMG"EhPqxs[uԶ'#dCʄ͑t'W=5uxe¤s [( ?zI`s:\u%ej{L$MX/ -󾷃8cR <u'Ր6j;W$Y(ךnئ39SCq9xgsE9CD X1 UXR_wKdq] 7o[G^i kZTDB7Oqmbr4`K%;Һd1$'Ť4# ro[U@bƧ$KR%c0ǃ@$SBe80(3Y`jDzME c}g}PlsumLi-͂VhD^QϪ=Z_]0/ppP 7&h7A9/xR4֙䢡BMr7T$4ĬP* e#(OI9ڔg:EKbLx,l\99x)z|K4鵓實*֪XmQj,dYgdY I5Ƅܑ rCِM-`IS&YbTkbG2P፧4Dt,҂F0qeFK8=96̔3sYȸh rT7mFu[7~B Vyx2B։FhTR.jGtnݦ/4AnX,q+j}..@\l%d~g^Vifi,D`MQMSAdtڮ﷯Vc/24޴yG{魻,Q?4ۯ}+Ȳo9J @D ghf Jq&82*G9KAs^ݭ'oarr8J;ַ̪(w$?`Ť*Vv/WZ3n8s g60sn8X S)VV^~fCk]stVo1IbP Q O{f;1g Fd=ǎ۰dGzҌDԁNT5^MՊgUPԠ8ykr}|y[AǥTkf2QJMR0("d.<*z;L|Bkֶ7Tu>q"۝ 6:꣠ BβzݤگJKυwoi\ _#Ln(0P  d6 M٠lБ B8H<sR6Jfx@P9iXK$#l@V+~BPƹIB81 LP6 ψO9T7~%t1'IlQ\yߐ/]tTSwd`9y@AJK9_i㫫|&XrjmHjgNRI4C#(6,Qsky ¼cLlS~N[|s(v>|7o$/V8ŕ=Za=ʙqGїM>/l=+3Ҿfe7(lx /^:G+u 
:8aq0o' Ҟ13>T#E/&h$oSAhk\1t4rJ+'>Z=S,l}#ls+R;f(3? #k;偙4N{ѩ#J la~T' eh~u&]?Ƚ/6y{Q<:޵B$@FKm}8l] /1J)R,kO"␔ԼH%NsԳT5?y|[ȕm2[Q/y۫Rz59^QQQ4J.xȟD Dy &g F8j$n\7U޸-",OxϕQ!iB@bJ0a D,g75C#eDHG4ǟ?h>CQ 5ZNѾZ,/oSU(Wm>y8̡bg ǷMd _fsqNrkMNMVS=-gO;ThYTFDv!&Ad,EaC4\"Rǀ8/ıE%*#ZXTg EO1 8tJ`U Jn)` Ғ9%frYXlgT²upe[we3DJŴ߶~ş `0ߋa2jF)b48Gs+Q5c:)Dk=\/5l;<.Cl`;8bK%8 A1TBshF&J>R)EvFð`\.;^*ԆNjwvӎhE$a!zlJD4"#A  TYyT;-(ȐZHBY"ED2 ?.$GV.~HbQD,tWSB4[pjT,e* eUc3. =P$ D[ZH9 ꢌZ`&6t(ۿ~N5J5H s @K.q[씐Ѽw-Hmc1RYFzI\|weB/fJc}6 =7C{?W SZs+="q7Oc%Iis΃237?}}0u??2_h-#[1&UQj4s0J('솜]*9uvzB,L56_K-J޸\AzrZsr.H_L/.q&Ri:?lg{xx?LJލmtpexw뛥c TI |3 >|We9=ÛpS{_=߿_xㇿcǯ?] Ň|u+0>0)*:+x2?Mq\S V ˼#}Jl ھֆ8~s2,E9ɇNZ~]Y\C/̾d#_O ެ_CVRC ̧qc&w2F)4 ~c ~v0!~VD<1"ʰ?:om}htS:ŸqQ @ $ҩ܄8F 4RN\pMg;2U꿟? jgfM>8D-K2+x[ *g㛎hKEV1n{B9T /*S~/zQԉwPbװaVĖ"ѩ 9n* l)ds,Zʗ!<,&Ma{ j*f|JR) $U24"` `A` y* @Ce׉\>%yޔ ϳupyFsJgdwh;wƖ.Ɂ"q*˩OB°Ԝ0ooU]p 6 ByO|<(BBA;8ӎdAq:y4m@ rI5:`#lm@+Z\Xύ$!\.~nP IQA&¿LIExbj1rCUՒ~'Ymh@PW983]T.=#\<dhQ[,V_n͕IUH]ArJR<͓JjKNb4Iiʙs1YΤL[D4(¢ >P1@#ƀfgBAE&rT@ 3ĉ9;V/^f]w<}6h.#D}2حf %HSjKl2iqRG,KQ**m@3X\ "F!;-¼SN4>o![08T%v͑0o߁Zȍm EH6%`PY R!PD5dVI6X~P FY 67liP>zugƠ'"JՀa<#y_˞'QͥKsggw!fx6nsp~e1 .oFzNȯWfp[Y2̂,㛓䛽x7cU8kj~zBZfo3u״PWu|ylwǙk8c̎1ǘ+uǡ-|u~dL7^^VL-[zWUBdUgҰ?Bڶ<=xsָr$V\[ǷӪ8Hq[)'c;\z4r-[]xgc{ESu0zqذr1{X)4g_pqeoy-71a71GԓmY=Æfx:asgPiAYg"J}}-M7+]*u7a[qR7 iwHER%-Tf2QJMR0("dV<*{߅ g;'OSZNieI'X 9iujP{ 6#H0ʹNl6#Z%OHrJfi3Q"TBB\YReVN-">)R-}T0rs'z"CNx-t?\LT1A9hx577"I BCǧ7rC,x~kFm.WjkRʱ3d-:B5Qqޘ|TM0dC@A2|PHUS9L`s*v wz>[ȲQFGυD""L4BDN!*7F+Bכ[w;u23\_'eI ztTOm;wsGhGXfw[]gJ{_e`ͥ5Nd佇 xjkZInWZ,/ZlZl6ݶouxXUQ~A H-8XC'Luu+;~%g$%4&!pkK GؔMq <-{-~O~ڎb>^zxgOq򪇗O56LW_kާw8EDsPh$;ۘD"q!VSo,""<:<ם;o瀺3 :V^}*Fi*1-2 )K$)P5rLtv:<# mG,H 1sesHРX EB,e0(2HG%*E2>9,h)D<b9tDEDpJKagQkFv%<9dt=tWAɌ-4IMnm[js"cӋ=<蕐.Ad,zaQؐfM*1nDEQjBHΑ 'S}>*[dj_ :%*I% A42g32g){b3X c!X%<&W"lzƝ|Xzgz/l1b ,7\F(EFhnc""O^3bʩHVڣ%Xr0dFs \*/I{DD02WJ)q6#Òqy(]lv<0jCE`x6ZyL17TͦDL>9O#9P)YL:@$iA-ddЊFbMK#b~`]H"F5-[[~^" bTDQTDXHh"7$( X* eU|(_vԮiU^ӎZ^@)6TVl!j(t")L(4(^8L°DIlch p64&qXnc5b@-Uc_O%ys^i 
OA|!Y2^R*e0ΩDQ?ϋæz꣈uDQ` hJgq9("Kg)`zQ/W~[l~~-@nD'[G_`UϯYUuŴ{Y:.-_lx?cP\3~DaX1 Q\C0Ҳ?,K Lh GW(0escwIR:\e)pJj"1#xY\N*KT7WJsMrpSGWY\i:\ Wo  \eq:Bi5WYJ*\A2)vXɯu< ;ݻ,rG;Tw[ZCW~W,qF=%(e9St$D1*ʒSs!wx=PԒ%!5t^k:zMG5t^k:zMGWHL,B䍚Q7jFMި5y&oTUi)陧2ۿ6=+Pptl }Cw\rI;cD^p%b=H{ivG< I9pfq})J23/Ix3[ +M=C,)c$ wxpV[8Q"9&g M7sfb~_s;$K .8]+ܔ9|~j9rE{mm3(pŃ@,D踧:1 hAQq.Qπ8|!%/uq>zM#;ǵ$.gJF#b J IhEB!J.(lpvB=8jͷѠ;,eVm?&!zpA_6`pm Dh*t ꢌZk44NL\.MZQ$a9%8­Y`U.(#yZwcY"=bs )bh[݅PPaWV\wGg7Tj~f_[w?W~j/p5}~F)+\گ~u/c.9p^xgI ~ǰՏՏY,QCẒWAw=3ɺ΁^TF?uUO]WCO}8ב`篃Q[j:#~JOhg0(c(BF%ΉN;;_x?}~_>|7qD̢9yM(L"eIq\WfzKAtD[*Rqڋ Udj'WgNߕc#--wʄĢ'W=u+c;+t:]=k~4b>#Y`HWMVSzGGzG gns]y99B# g15Md!{$i6c,fRZ-ڒRag)#Y 3˰|Va.T6 XӆD ixJk3Z%;кd1$'Ť4# jo[U@bƧ$KR%ã+9=0"ALjM<(ÁDVHu mm".B1q9޿^dNv۶D~hn(W=K5vYGT#y J:k)@ Ih.NcGڱJffk_+-Aca=M`;$$=U|5Sz9h(. P7`=M& luvnG8H<sR6Jfx@qqʹBN$| ,+JUq.@N%eL*$ <#:A&¿LIExbj1q6SUDB̬֜O0nr*Ŕ b˷^郟6GJ$GMn Z8l#)a FJ>Q"wN̞ݳpH%}U'RT14LL9,gp-PFlGXaQKH!jR @bgqT9* Dj}iUL 1q4x%ĪyLMvM~K|ݻrm>!vJB"m-2iqRG,KQ**m@3X\ "F!Q{{%FѬ7mӇ- r n[ijɾ1LitG s:b䏓V?NJi+\="\tzA,y.)|%ƽM;yh4VmT!~y{l馾p'j} zq'U{/B>"t.83WЛ!Muj==}_˞'uzxGd~Xn6 ֧iM {YkU%,zS \ҏ Ƶ*]-՜[F꣼Vv\1mNcH; ܙvD=ŅrjYiP Lu>˳ZH!KHQUOL3Ub9̩g(~#m[-n>RlsC$5XCQM4BDN*BA!T nV< ݶr7]kߓ*51I>0l!3_\?ve[{U@`^N3Օī F"UZ:W'rֱ$'L׽D}y`/؛+9=c$9g|0)F=^ ^RpMIeɾQ蘼_U C &p*./!ԋn?|;=6xx5ZY}Ż&sZmR${]%rE͟D Dy &g F8J]N+/^LYfQ-OxϕQ!iB@bJ0a D,g? 
((-Gp(7)E/Q"ϮSP4p_Wͱ㓮ԎW7aբ׷ޜF69in8nrc\3S[/X%uEQyssϢ4z%$50Eou4, ̠I:؍H[V) Rk3e=V*$Pb)U1H*TMR9qr,,B^ Y;ω_f3O'%#cv9]dPO[~ Tv4| ߲}ψ킰p5u48Gs+y1SNEM.iV6!k!60ceRy NBP#$&DҥTJdVcAb\Ԇ¨ =j:hE`0܈ j6%byQ#A  TYDUyv|%(\荢t!VZQ9k=7 冊d_Erkv[TCR8Ҋ&ކ@f즓c{"G pN&f:yRo^Ƨ$73sͷ Sw.vKZԡSY?>嬨8lq9Ic'ލnj6 G.2v>vJ3\WɚW7}mx~Uws>q5STMxoJhHz;fVrjS|'{:Xn}6aYQr ν{R;.BtgО>]6;sR|`/2N2ThE !UcS @.o]FZU{|.hY˵pAb"&?ygLD 7>\G0K~ x%3{k#@1q0}uW՜C]aV2\ ,F1끟OPTq)nw>ѨCWθT6 XӊD* *縩xJk3Wp}M# C2!yYLhA1 &^¢U6RbcIdxtEO0ǃ@$SBe80(3Y`jDzM>E(FΖcW 7B]AnWωxV8lCVny(*{==YϊIZ_R:m!AIg-E40ʣQR@IA`XgRPj J7T$4ĬP* e#'֤|ڔg:EKbLxB#g?g}r¥0W_l]EmYv,dY*N4,(f!W7ͻS&$K 5Ш+/XDŽ5qћ;2!őpt&8"R BQI-$I2MT݀ph.)IHp .p 1Yubɘ Ij(jw o<Ͻ ǐ[_ΚݾWj,Wc*0޵4#e1[#Ӈؙ9aہ1E$mzc&HHE")",K,G/DfsBJb:z"%g8_ iCR-Aw<OjtqT + PyP&iGwqɂ uiH(A rI5:/<``(UBPƹIB81 L$x sN?":y"i4]IPymR7~Z}ŕ IUH]ArJR| G9E81{vώ"ՖUHQhJӔobJΑGd93m2bu>EU!r PL($v6Gb!O֗fXŚΰ'WB߶~r'{B/r,A ߾4wϿs%2:ԖNRI4(RJ (k98H¼QNy(hޛ:f_-Ҳ߁A!CR]Vvi!nϷgM;+yTVThugϼtzvJM&݇$;])h(":QB}9 c揔&1O 9gv}^Ǡv)x(f__y%X{o,zli'3c(Tl- 'a܀!BXO6F)`75T{'t"Yb|N;`x+@#M4Ε1(!8SRVZ(AQH$@\Vds-ϧxCe?oS$~͵%vW3ߤ"|Z_2<;O 2ǼZFmp1Iby_Ea .L:e!OMfB>BO<yE@HB|C$J)E"&~\IPT=ULy O< <#k}㔄IX3/EZJV;1ÁtZ_N0)}b?JL$RTڲb R<`QHR24ϻU=V]pROwDF^PPl}1ICss%쓆b+5]74r-2*WBJRؕZWDT.j[9 SR6S1VZ@qhnА#yD0^ y6cLZ4ˊGKy^I-/ٚ_Pm ov\VZQj2UĜB/FXҺ8v܉9թ#\&A\Z7CVZ8$4ngyS% YZET <1T <0bT*X]h.!Ϻ0:z.|ȉXEh:*BA!T nV<.3oWq};۾&dߚWydK3jdz*cމ֜odz,mp\⤂H“o$ Ou" ;:x/x/xݯ9¤9.",xI)NoFxY=9~a_[sϴ?†=mxm^;s"cqGCGaf?~IP2p6~.$kAh47O7_I*Y]ˠ[5qj~|AkD^_$U=I%p+ l=菤y;DA2yq:J%<5.B-y>u@4Zn$R0NW/Mmw1r#Oqm}^3y~/ay/}* ;%ZYؕ% C@7m,,yJW@ȶЋ%Mmj]%iuC));&$&Rkݲ-;U K@;7 N_; ˺tuˏw@ zjjS^ Se;"o.]5 fs/|GulW5j/\"敒!^i{!{򖊃J#G]u6,DsEݕ%ZXX'uy3s9^^[nm~*\sM/(T\LܕP"g~/}J]7~qA5=Z= HOn5%Y_]|fKGإyRz-b r̔l`ee])qg0z8Zge$.V\rEԜڄCcP+*㒳LNw ;'=7Ċt>ȹ<#Stkqٳ0Y\hR{oKh0v HJA(eoQ|1JQ#e xV;EHNQI)J kΞgK9?%>3SunIWOG_'~۩~S_,(?q02-k5گ.OH|qOub$:A3`,,Qπ8E琒Wd?k,s,y3%#SJAiL՚ oTFD%Y|YgѸc-P(7q8Xo`p^*N? 
=/IIpG"BH9܌Eh\ibC''wKK-M(HВKE֣Xh;惖$6k1YE"rKm2!*Fl{#$?r%&C~ўf)HQ4^ 'C9?BYcyfG}y??jM>y+'Eh )(1QsN]MgWq`Zh%+yr8u6XkBe77W&yIp[Û_a\ʏ 1lo;uן]olk kJ|ݎW?A`gi礽rQiÖ]/_qO~R[Ӽt)]oGW> }6v E`)1 IYV_̐!)%D4_UבԌgy~tv>x ήwF(91_UbnEժZQTC%ژR7t nu7Y!Pj5jҺ|T =hsp2S ZN=@HjhRէӐR%!~QsQpAzWT*`O|h*T ǰ?z~_߿~w0&'OW[u:[4s5 ͛m4'9GN&Ch:#hnqDo/+r0u)<`E}..Nͥ< d~X .ƾy4H)ʝNeL9+o[>4(΀(P 2Z={J怫h9 u&?Qa1b[%5FX2éd+@ br 5 ʶ' h'kE}wUԞ,n; :]݀{{[ ĉځe{ 8L0+g0f^r-# dRKJ\ 3}(Gbh?wn՛f;p|%vSφ_zQCA* Rw̸c 6.BurxyarƎ ` Y+K+1ZtVLqd{,}tR;‚DBhd!QȨiYrK*Ljd<)V2""&ZH0<)c"ҙrٺyb+Ǔї/olx2vm27EΗtvܔYQQ{:9>x/TҁĂ:“dm^RP %lܘ< fc[&Pq'FǤRąr֝|#ztBŮ{ŻXb,S4a^ܑd,dYⲱkL "$hPƖPyIppd3݋$B=!މ|DH")7EANqI ql=Q:OD8˝$jh`À5KGp׾2EAjix-vAڰp+5tԗ,S4ݫkkG^b5Sf/ҵwݬ lķf\tf@WZ1b`~E:w\jj:p-n8췢val&5&sjN:>~Z'F1 T߂VS|KC S;_Ggsi7r\=F)d8ҎIKP;XԓAEŇ__'1{B ^|۠?MpjÓf>-΍'x~.7W{89ڃ`CKԀ \B(]iJCdi2]zK⚖HcWbShL#V h1C,REcLi)=73UE9DŽRn&x92J#"'w9lVJHH!$+[w֫C8#Hڙ9evкx4C\¶PP]j7j*9(?Krx?lA ؖ U`hi)1H4=i E>80zdpV!l,ARܣ#B4M2 d".Ŭ Kp3"اd8LAp8bi1g;J'br#lY'NFXU;;V4 F-Ls^+|mJ>s8Ew* )DTXb i4.EcC',ۢG"޼헋1 A&Anc%{5|7;oXfo`/N^PJFUX,Q%UԗVp)iBN7L{L&Ĝ~."~9/ڢ3RbXd.e>$"1L)#'J*gHG5e|Tkb -:hvS8$0SE%]A(̄r0DmUrBP9ST`X>puiR:z뉱A/< ˘њjLs+w ~ϼ\Δ4 njQW';=‰owiʔJx}G阘)l{B{@.Q{`.^_;k\F:+cKvP͓MuLP[jiW`+2~3mkLfi]ECs9h HD˻eƦpI^ɗp`((U,V=%>J $}BN:b3k.8-kwI`<%T2a V"v$"%ΉH $HsgzY^ځ.{7޸Sˆg+#.{^G"(%Ǯ=?׎(djWd~l|S$FNSg[zπWȳw䵙?0x6&[uX#y~723$l]p~~r.Ő]l%>atQL =KpvnC斻pu{ډitcjWx\SsBxwE/6umOxf ]vXi< ]hMO7?OռBu y: } /޽icԞC+"Vf]hddzb,a^}PreK\,?*6.Qd eJ+gI1Q:CSNUHR#-اi 4ч s!cJ eY{Kl`{!=X/P:_4 >P-UG+\JksR1'$_ŵ9.es[{8܎9:cb:98iyT6TqGPeC2NcG %uǑ^`YuȆ΍u 6Q(龃Qh&&VQ|J,=h:TdzL NVF ѺRSy{z ۥ-5)P3(38|Oak߽Zfm>TYաNO})^q3ae5b^p1an(}&=xB$)M:ޗX\,*k ^;moz;zpM_6nf&%VYRx sZlKJY{\ cbbV*Gj*u 2Oyy'0 nb _bF=Wݼq5W>iݴ_CZ92mMAvG  (/OgaWTގ,Vי*^Z#I)0EΊQv%3|\PQ_&n(Bh0wPL9& LA>6[L6NKe~,1BxKWU#G\o^>,""_4V0#*)CL&zG[,E_+{*or ڑ!>l~7֎͐UsbRrmM .1̈CĬ^A~]w%5\kq+uvXj,-o7X~-io/xdw2(, XX1c2b=6MVHKĭ/Kmt.܂˫p Db'RK5,*bx#4kDDr"2с!bD <* 9N|0a$Y*Jm;I4Mj$aBZI5>GlkwfhJڈ$˷Z{rVKPvPuV6I]S8o}VA@XI.(FX $8&"D'PH@٬Q(֢$ eLGmW12#XP1g&ug=g\Rv 팻B{0Ei4o^]WNϧyGgt9PMEPc* TQ :)BTDZJ|AJ82 
tA !D%$ItY$R:"*uEvT%r_vqW2smsd5GB]@5#Ah\h'Šd^~Y1 !f EC+mFI96*. !$$ eّ^r--lkMY}I/:4Yb}X҄51D&("Q1cdj[2Vg=F4U}Aj㱈;DqՊwgi*եg2'ϒZ"{ͤD奧D "r" *"ZS6.ʹJs5\pG $פ!%-@K9r]o'_gUX\ԕqQu^C㠏.k2ֿt{p|o $P"IZXĵx(I IZqH9w!KZzTIX-<*n{/x:\w9V_#\)i: `ઈH])ܘ+X1p0pUzኤֱW4q@pE#1XSpô2HJ^%\YY;Ds m*YO_~Ӄ}Ʌ7Gs߽\PD.EDΑI .eA99-"Dja"=Ǟ2B߱_...g0*YZ571o7m_ ?Kܕ$0BD ,H8g2Q9$4ZпuZFEF.o?_o\OZS[wqI2&kTy2s θdj%%؍FϏS6g?b/׹T.|//.17~  up4hosp~/.8=\0cztZ wMkۓ˓\+?:!j`Ɗ٫ ״W7Tj|SE~NGHlu?}w0LY s;=ϭ7s;(6u;s;`lHRґ|0b0|1EQ4eN~1NfyjˁY:`GQho gTTGHö+ ?Gヂ;8Fq%*Zw-ǿ?| }xf*rLOO zCK [O&|q+/-qrIr $G^~B.u=)N7&YŽB \QMj~>!MKRTRc !82p[.{xXi#-;ս j.r>Vӓ)n `w'òo2xA8L&,f12c$mŷKC7:+RFOίۨB?D )2D$yP^g1Jc- CYE`Vmv'ܘ@ ,T#K}`F9pʢ`)*+Yo/Opy,Gެݛ i}nE=^%p;vYjb>G_: E3>Н'1y#Z+hLR-~w8ij T5%nj&g!64&4uT;gp6;@x9I2r`Ay&cq%YщTU"r!% KeD6W X[C 2 5/LfQVg=U%nf>).ӝ8ݡe}C.GG1wD1-ƕ[,|)'~|A>3L7)@rRFq&4-Y|59(og`N8d NJTRHIeC2,!Kb2<`6j-j@s9BA ܅ 8a%C cf.> Zz4[Ui|m$>Ƒv6ՖJ+ BgN4W." SB)o坰bMަqroơ߁A%Cnp ~7o3 m74dj\.i81rBZfX0=J|Ol>՟X栾i09BbJ!rl/3䘉&ȅ2Xc3v7qݔ*Q~{~5z[rx ~oTwOlxSLbe%,*OD]sHrM=x%O!%!2'`0`gZZ1zrFCRkY%JeR@iUi}#o=-7FfE{kZf["=m&׎sdZY4h|[ۆ/]XgLɸ+@0䢻%m \40cldh=gPeر#P֚#&F4%Gn?je MiN+$̉ȐMD 0x N5R=}::U_*@5)H S)?%Mb"D!h2PricWQ՗P[}9['=}d4֗ty|_jOmmwToGTwmmet ŗrK;Ŝk`{J@å)F ]Aq2ȤY"4p( nݼ~m'g=>YKk;>LPgχ?ߑEvdi~e.Oi\FO&~56D>y9ah~^i.ZMm|z}iw=&)0 S9ql,F3YpVzPXq?=3~]u{#> 턣XG Ձ[̆14Et -fBA8Jȴ%&E!l>K^>D!yCH1d{ȳY{Ee85w3Knv|5zC~o<GJFpN\*2|aLIG&W 9R0.  6A+Zب3{,&*YЦE-q6#q/]M;XCxE6Z"=6K VLʤM*dx"2Ǣ:gR(2x.x#9L̐#Y$&%0Aq$/YFȨ&*a5q6akH`DEDYe"x|Q)RAx9yD4k O׉Dɉ4-n gLi$۸$6+!j>L*)i@@ Ns ReD&jO8g'_g5-y,.ʸ:\pqm9- (GtD"J);DŽH?ZDN(G  0Bu/xXM;23@؆Tx{i"F@nׅC&RHZ9V]HZa)ƒkA}@pU~_ H\աU̾UR^!\U"C+VppU++4@pEk!R;\)Ug FR)H`wE&WeCbW$a`ઈ+PH{*R.5!Z-]  \;UV}J"%v5•EmN?\ /wOL9$tjuDVəU6ULeʏn }TQB1Q2F]!lUwR% 1)SR3ȒxQ"6YXyE~4*,]2W ]2Zv%K,[6ZXJ$Z12aDAqTS$}^G!K?[VB5-~Fe#6Kll޾zBm< ҤH\'3~CbПK4$:L^< `=*l;K]7wZys3^EKwׇp^o5pL-(rmI5ݟߺM e yI*xh-nlXAmz. 
]?Qйe_[X* Ş;Њ E弳>5$)Y_Bm3khUu17Yr Qw^oMEZدCw<0XfgIO[3bVzcUSw}epZ~wh咴,aX0l;ũ ojUbZs,LVx*b piLj0ZX2xP y5z N|+p78)3npD⩕m4F!V9Xm<js,NUM-z7 |}MwKZZiQX2XT.m?<=^Zn.i]4Mz<#h^%e%5M,".qϐyEG†9GER t.eP1ZBtֆl,`\ӌH&BR Ш6 :["=M4+hdۨ?0٠E,E-I Re~$I%c WةN|5RK6xXD"p^#J3D 9+iU7N$Co8W[CE^~A~3R~ CDhO_qjZ|$kxG4M_](}g>0§rS of|r_i}O_ltx-cepC@qN1&\7Zpox}xB x&m36(f`}CΥRQJ1=o֊h@zuNOA{sq1: ]L җSw)]]HfD#Y{-}{<98^kO7aM0cŢk%gJMG74hz{q3ҏ+?]Nf__&gw1dʅ0koӳ؂%GI(؊'@OByW=F~ ~̲|`b`Ų@;Munz~ާ NzꪾZNx)>BirLGGFF4sŠFĿGwqF~pɿ'~p…9y'~~GΟi&)U" | yGZ x5{=m+kvysN$m/,v| &Zz[*Pz 籅CqD`_}nwyXk#d/{: s>BVNh㜌6?Y1]F/?'IA[JgȍԿuuZC HT+D$칌`cJU'<{ \Qޑ9>7Tc/O#7pGlNd .LIQRrRULj/M*P4ؗ*1tQUbhThQU݊7W:]]x+Ξ/ފf)E "l&~e )܄Ľ3!r/LHSHZJ3ѓcgd<Sv2,U ͎H9e#c %C=jʡSV \Vg3>QGU\Di?X.7cɰ<]ڶ0/~5h,7 tdt|uG|T)_񔤖~ 4gjу%nj&g!65&ijvnGvQx9KJXag^y0d9u*O%tQ:Q\HY묭UƬe,pϭ!4,*Sjl F; ZS`ܔL/3i_wҗpZ_˛䣣;4ӻ.$@m!kv[Ha.vRHR߫B4C.$0+$Wowr|r0eL53S2vSY,T2z_܎|rϞa"'ncP@Id+mx.'1=);ʂ0`.s/X 3#Y@#,QkR$9uy=bAf-wAXM!ah؁+J >7#-cf5[EnsKcro\*Ҧ2Yia@=s"'r1JJJr.TLT2yg7Ùq|/߲aJ~gnᷝ;f7"OCu}JbhϦƱr^ lB9!-3,Ξy{|r|>v`Q-Ky6r$BSx~~A`pFÈp@rѯnU(iD.ٔլ]i9z.BbJ!rl왉Vr&ȅ2Ŭʮ}TvN鐨؎*/cx_fSǫrK>]5c'azGܽ?;t?S + (ƄD@=!sR" ΥvHrCڧ%#vPtyƇ  <3 NmYi8Efa+K@fY'p*0"}"J"mytdlL>%&.6kE{j-vЈ/uHDgNjh'H C* BhR*I}8B{j聧'CO'!Ov*h(풲IXX46 BsbZF''p#Tꑧ ° 8{c8zRx-!8΁N([cfP뭾`s2XWEn?jB Uvdw H !+ ^ x1%D3-oƥSR2 2w&&tH}'tQG! 
!a?#!Ӣܤ>yYx JFrѯum.a"-PҦRa-F{y,-`&Yç̕ P\J']AĽsvu-2L;Of6 d#*MLyk %>oj=vG5w))_9k9Uxb5$|bzm;{9E;"71Ӓ-V2C rXP؜OKR9AY$NT +M›7ډ%ޢy&O;t*6`ꑭ]T}A\` )Sj\5'b"D!hJ"Z#sh]d+嵒 "3W| T`iz-ggD^yi4u侊fO5v 'oYCu[Ӊȭ@XOznɎ軾>-:,盕uow;~HxlzoE,{0{xHa`jCIG |ǓLg<$bb4&s)?`+풍&VI]i&I-%fkgZ~|3܈D˓) =gl{ܷ?ݒDй{OQ Ol`~3?Z>3s@_0 ˮ)>;d}s:;6nד;QKx`A[ry˫Yq ӤH]TCQjR+p7yѰL{]Pxd=v^-YAP&[0$!Jds1hNr9+`dT>[T ) ],+K3p!k sֆ,700koDUM(tH- :o_w!?77?Hm"5L)sD l9Hĵ%R#iCԊJ;ԤP~尼w>Q/w(QIlg[9G3n|D2L\Qp>5S41lr5_xr_(Q|Bdn[ގB:`~[ĐGp0 Fk+W?|0ޟnmS;dα_mu)bx G+1mG\trGz ];fxpŚʖ/~۬6\[RK } ԣQ<8j&7p1Q]$ K444Kldi_m1e'Sw+;ޭzݚbƘ"eˬ Q!Yo18L$rFY2խ Z!eɭe J:sZ8YLf +׈UgޭBCod?7gWB( DyHgKp*S P'\#咄)O8̞l99n*h8-Y d<}a1|L1;#7,G-ܞ_Ng۳]MRdA 4&hjdpHKX݇{ !Ձpp!:RDib hfY L&o #x f, Lq%&h:Keln?N-qbw` "K*z մQ v?Zl'rn٢ojI j_k"]W&闞֠2]%0$oWo߲4bd6QV&ezT657eh۽R$L鶔o$;,x]Nc/Gw'5mŶJ? Bp4<" h=C3KXgnoF_,RB(Ѓ-R%B!A#X{c5q6!wc"ehȵғ J/NoZ-9谘;ޤp|~ZV]r)A2/.SC+DeE"]50UJ9$t4ax=-=e>:at(+Lr *Ex YDrdIKU,^A*TaiK2Vƺ[R -w YIȷD!HU\{I'kmDa=uzZiCb%+JΛ2!aIH )dh!x46 QygXԽ!5mcW{+K̅RvVKl|6L#?.R>w}BaZR4Wr%#rfM[r$B?E̻ *0r!xw)6V̶D<5hxGdMKrK@INr%.8bFKe8ui쇷4fqn,9 =ws!3]mo9+B.0%@pz=W[r$9ݒ,nI(YC[MdS$*,:ƙ"ϒr~X1~"02vM$^};k67w R`ӅnR^ &dj֊ XBJՎ;Kwi~|]?8%gQ`0W{EݷTFZͨ80s{|jht in573,`"P+LV~ٻtUwjlN^զw՘;0#a w3#~_N>cP)@Jueݲ~q PǿOgO>~ٻ0Qg?=:rD`xk~NӟѴ47i*FӜ|q. ޟ>jfI ڮ7ߕur(NҡݪYtdJ񓯀0ͯjUTRcQ**|! 8fDi_091HO*1vJSUpr8!o@u vϚ%AF=iR;7fx$_Q' ~iP@z$vh1T{)r8`Stk{wdE5<=\Kst!:"F5ykA:bC&Cf8;tT`*!0nTu#tbnc-sorMeLbZFNEU xUNW}o%R$}+_kzi/Yl-P'^(SM `7&:pMz] X ` Yd(2[熴Hd .'g|Xy{uO݇{*eyR' C;97('~Z| 3 iP S߂%ĉ-ǻ+M) C pLOR!JD 2[pbV%T8 ySLtAqPfЏIt)8E_WnCr'_a(Q! 
-2$!0 Ah]"E;yL1{{&Fm3Γ~Ȋ# 2r+αع0־y1qD;@`ѤH UǒC!4> % զx1;1 9%UHLUPE}a { )TBmc!\/ 0nđA80S`'HL X MUR$Gji ic;qm$qjQ:$˞lfOGkCѤח*Flj[{/tۗ=F08BjPN`Lh,AݝTw}[D .RqDJAvR) i!9JWcE!ɝ\mFNRbXd.k%JJ#2|_Ф&ptiR:z뉱A/< ˘њjL `|qAv :V`Xb7 tQre*LG 3eرr" LյCr2{Ӊ,MܱH[NB4tиj^h[QMqFܰV5Hgn44!*;y IB_-ܔ_"5O/<o7O E X !`OPIRmMVѷ~8̧X;xc1L2E=YlQKϸ(sˬP`B<P [ygb&ye4zl5ͭ {fbM5c.CzyԵ|^F +P~C9*tXSʢB)&7iL &MDg-0$5 ,0vGEwo> $$ݺP`LYHrzA⁖ RCqȦ'Qn}go% :9ܽmlrzUSf2vbfE5B NrbB3*h* <:͊B}~ @Op).}f_*Ffd j0DȘMȘdl)XHZ,yiMb tЍ=uzopzn"(L`[yS"DEIo쫔#I^Wdhd 3 j-$& fh/#vHHLBDNܧYض_PP;vl,3j[Lh5GB  \ s .c4aEd2/F.qVqBfX 9AQNq{e2a6qVaeO 0 "fӏMgFD"b4 'a$u" , $bqh% K-<j9Ҁ"V!!),QHLd$&:łK΁%MFb̈MՈ"P[$!ufӒMqQfEbKkRT;cJ<[1 AJd0@PgJh"Z\. fӎMPeCVD!Wk8?`5ۏOsQ#~q8._\+L{r'!<w/݄/$L'YfnŌ'n=wt-;\w}$&yb_8_U蒝W6z Kr^_\~R{o zU{OH8Y̳$O߯>@sJ^U?߾_E޾|0O7 ۭV\}ؗ 7DebNvRg0yQI&R+~5ϪxN*MmüaL჊|ӄ:.:o4Wmx_u'O凉WaMcQmԕ?c40N a oǟ:~SG4D]x#{ع  <:W|,DVp"D?k:Rj f(HDcݢzViCacegXtEנv ]^}nf {m '2 )JHxՃ /*Mo+ UD&@Q9f8D;m]CNn::g硬cUHT Oaugx|}; QjF!b"koG(qC92uH=X(4` L1oI_hLÃɗ-gSɒ?!kkThq{"Q;^3K0uHHʁ^;d1 s4|Y޲=3n~*Uicx̮z _u>;!60J&p.:zA bڼmF (]*]ԥWYWT:-~ZkfU!,ęwjuz ||cgJ.ׇAipy_+ WboU?3o,U9yI啯ʑ9IyWf72J1Gt#9U[{DnBb)2ﭴ;LQkzLg!Ϧrh*B= 嬷o{}^}0Zb6:Wi}WIKSo]XͰ/nI_Ly>Y=n 0GcDvϢ},R$%DJh Z$Ŋ̌#+eܻ(CAgp%>ru T'c7GvGNq@^Fz$[/[6[bJfa; .npe'ιL/쌖W.<|iJ  ]ym/=Yf>/Fj&UؑUu-Q!e w}nq5+c3|13ă71o~ KUBZq2ͳDd( OFW7s=AZ*o#].ܟB~Zb'vY._.UxbRح,>h)u5LKŵLFl םr^wEB %rBc)KD(jY6!G;:&lwg?}&2cZ;Ɲ֢k^:rۨw>ZN9 ilՌsfUU̵PRXu*ƪp- wd0V! 7CcVGȘD9ORI%V)ִapf啸}/WJh2-}EM;;SvEPΎilE?!nVSY2Y"V(+,Sk`p`P&FFc>1r W3dc@ȳBnTzCJC+e2a̷!0c+!$B Lh-QJ'Υ*"E4sqČ:M;qT$UMɷJ uS؀* U r VۓJ}B8V APP{TtgԌAQ"!\`' QpFMX)h $!(byI@+S#2̋EU(  JӠDpPRil,@x@Ht! 
/\9$T댉p0,dPg:c鴭_֭]b_|[1#9IhV1|Imb:qBa^D wnfغ3=m~תJf=fղB$qh<f^x1uPKqY=T۪dpҫh!R0 RnU L͇(yA3 vXmQ|QE (%)x$L䴬yE,C`rQR0-aczOA"A)ԭLx$nEf06EUgGUSY_<9Q<|NET9%$'a O6Vx}{t[:ޜռsȓR`QD̾ ]{ #@ |um^K y1j"!]%bT@5ʪ+H H֘ jcBQP*(S`i @p L>!\;djy=f @cb- FQ'0%@ "mQ:2Ggm]acm\g 1,PMܳFrqנ\U8/ڒxYk}(ѫZFD (ǔp((-t5E (+hX4BS"p#FD>XJx>j9s9vksڌ!$bU_.u _v)&d,¬""!2 .Xr준)`$diPҪ5]w*;c70&`j?®z/guً-qtKk{:ȂjL'R,PBl9!_{8U+]r딉Τ7l+ru5nOKlVuzk'*&0qe 0k,?wYvBqq&y%0tasƄ3:L9o5ľFoľFoľFoľFoľFoľFoľFoľFoľFoľFo}dnȶ|HX~p81D#9J'8$encڈ"H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""%E9$-l CZ= +')@X"H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""ɒ@!"p s4jJIfD=A rL ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H " z{[fYjZo/ۛW~Nr$gg@PvH@K0W`}.]zߍ;p `5\JjJUR+ WO0\5Ι W`JjCppbRRiH &\5s'\чfd^|UJpf?d;=LMNy?m?c_\~~,gӴr@uٛ{jkSϦNz,,~I 3,~=heLONj=V++A].ח?X {a39N:njlVow6ers2н]k#l;?QjӬ/ݮ;9Y5F)htVas|JBf*uUK,7pͫy6pY2/gp?^]緼:Dz>/bv~||$r : | |G34" |md IfԋMWruyw^_v阐ܐw|-CLnD_:e:jLMKGl vO/Oܫ˙߿.e+^l.9LidOx⓿/ 4?>X]v{>^/ɇ鋳ŋdBu-]Nt;u̎qz5._^oG~ /^,JYlH -R쁡6)̼?q6oVvcxr_/,}AG~Yfv ހZd^o/Y|^<=<|}.Ekfmw&Vp]rkd(E Uc^] G v}~wVGmT|pS.l'3塘T]n*awri{n ~ޖHn#l+i|f XSON"rBUerO5;w6glv赭W?{EA.l3ީBJ;O>go\lFB*4XGS+kior$uԑSGHqGX-OtNG*Iע)M]J*1E}WEc58RIQI~EAǺ˲xb>뺣6VrC1jJK?g G9d* lT7FK5zo{98 uӿVLSllx3u]o#7WQd2@ d]a,i$yl}+ϫE,xV9K/VӶ8['Z%cƕQ(8[W`u" vuOu\/xn<+9=c$9g<׈V0)F=> ^T'#[N_ȃu#\QW_l+~ځWcd=F51]!B=Ah3RMG&WMRICT.d~h#=Hۥ3jH,W.5gdxo __O&6 aƌѹWZvsu$6՝"Cw.r; 8Źj4ަEBn#zlkuZKY/uqǔʆF^RbjWN{+`0V&X9Sc7yp-܃B֝}R(D'9dr uy;^&d)s;!TQPB2^c>$3Gz5 C8n;yȍ\ڪ\{pI V3d peQؐfŀq^%1,abg& Í蘐 T)Oӈ`)TJN&PeaRYﴠ2"C  eME,2%GQu!\]k[<,FfyX+\Y0D,DDl%:%(46פ$OH>GhE@JD)eQ7IRQ k2uI3P* >,/".C8lŸrQ\\{]"сrQC%E B"ˤECaސr\<yX;+uay'w .Ք ȍ^Lqs'wȆDhE찅\;~0,C$@\rֆ$nMShus.un#J+f8jBa%%Rjr8 ?ΩDX%$}+vеn3c1)}0-kI8UCs?>e:8߮zy{}8FW2z]~6 ߜovk͍Ohy0 ִ7{>+혷C Ocns~8¨`9oJ9˖ys#NTZsku=Zsk8]ZJ_!'q)Kd%wIB$*FVƆTR6T|ˌ4',8cȼmʼ^@ͲPFs]Jv~m~[Na-MOz>`Vܩx>}^R/甗שY6F`y -U)H (dcSF nA28%aIr|{6-Zf/* Cqfyrkd5}CԜڄSc2Ur_QEgjfwX#!GS+/pWmP;w|ֹw03VG3MQj1tyIM@\i"%l/L(Em18(8Nj"0 y MP:9[Ns>?|}bLIAF~v>+w?$?.278?],K-+<|Mn;e$P zF8ND#hBlbE%jB {! 
vA{W(@Ud.%.gJFɃLuH5 IhE-n(;@\ X32yOU*k}omWI~m;n~^7̞Զ/yK'{2_\NFe rN\I)'vlǷ}v/uWF:^WP~^!HX u01ʯSzB;FQiT"O47;;@vןޝ}Ïo^ox3ٛߜ5:_ L3'Hd~y ]R]c-䨧MW{#Vh~~;?_}z?t>%ͨoZq7,f_Aq_6º㎟6QS׻Tq)nO>jC< ͷq'wt:2x>Ի-éjRH^, g730`'l C:$xbD 9+þYiO{)}JbQ ISQ !p.@\i&'6V{nL>WV^RF;0l9`$jYR'YCRP9(tD]*RԊE{ Uj'WCI!V'B Nђ ثǝ߀3ylo?`Îd8K:Zdxh[2 ,.SF7TA/ToWIh⼽8 y1ኼɡ&^s?2Akv.q@^zjf_|ʐ>j11 sX}p1cQXӰV8q^кmZ>}5E'u#IY ljlKwBۯo:ʼn$ 4u#%XX*]Q39%]Anxw?N?zC3ȇ^M.w?k^!}y a AI.玟ceFMU2{nۻvO--ca=]0͵"i!Dǫ?n@jk(Qd bS- *ćʃ"4(t-Tmo;ێH`NCF  Gt\3kS9l6:aYQʅsa=76p*)cP IQA&?LIExbj1r6CUՒ~!Y):rstS9ַ̥r|s?S̩|iQN8+>>Lhy8)_4") Q(فٓGc`T[BxWu"e<ÝT!קoi /EXTA҇*&hņA3CMP3@ l"*G9CH/aL ^\Qcz_ãp/?PHS'LpPe)JE AK5B$a^(dyl)z{!Jt4ul~+ )r#=PRopw4^-$1F?>ަM2`b<|ZBdcxVKdVۖ8Vj6TX wK\}E!Bux 0HMse ) I}r\)+P- 9($dru#eb u5HZXo^G}_G}w2J;7Sj$«ExwXdyycS_Ea L:eO33wC ="ϫCdEAF!v!ARJ 2@U1EEF'uy_,epk#>qqJB$pjIe KѢc-hm% L+VD@ZoN)}ԢFgML6 )jڲyw!gnz^(F8s#$lV(?DB;FM .0뢱A<`|itO"SU`M|Ӹ3Fx5'7RehNP-Dk$̹7 F\t:U7>H~+\o0W3@]Dц𞰝V-ؿdQD&}{):,y4KH ILg R<伊QHRhfx5lȹ7<][0q\JSB 2JDL"'b$"!޷EqP%*ѽ(z5 8KŒv=<Ӟz\#*(kxWc)(bxH[P,(A#4d.v4pUX*KkԡJIh+#fpDp˒aWY\&B:\e)u W$\In}"vV&|C+{=/ə8h3N6y9NFny. P9w˧;Wz'+VEtzJ_9}| /l&`O o~`̝c7{(Hz"8 Ԍ)x^ijqշX(0AI =f 5NK9h!s4&3lyAABF6X%:fJgD<:tWU kȔ c(?#%IZwȞ*V8<!A`%ȴ✔TB͵v=0pXٖNxMwB .ыo/¥&}:+ UAwcquHVtA!hL-ĥQ s$K.eQk!WQכ+?좬S|isSÿ, w6 B\ўGH7DQq.Q$'o9$\ ߿s%E@#eL!cSJAiJJ\yF+Q#EE ШnW},'T.~'cOV֭f%i0! mpm Dh*DkDPeZ4/2GhKs-M(9{s#zVi@F@E1ӢDf0W6NliGEzLkcp! 
e.+~ ݌w%/c^?L\-Lhܖ)^GDw}t7ee]Y}0'*A`rv]U =u׊..!JM\sgᙪ;sx)Z~\fܭ>0O є3a.퟽A.~lE8d \|lpqOb%VbuOuݰnr`Xb`ż@]tmuu=~^G9`_AQ/'a QWMR,~󗡄qD/)lG,kmE6[[/XJ˜-өbRH~;e 'Nd(yw C:$xbD Zsa:;OJ߶QG~(xP)H@"It*9*7!E=FusQ_<ڙDYg#&Q˒6 ",鈶T8E &A nlNipT!+NZ^v\x{7Û~(N?tnq}y5Ed'qK.u~n52jK}<u/L|jt7"eFfjM^ԆsBKiKQ*t (UqJui}8s.}l[ SA\Ε$:UP8Ms'mbrd-NM8cF쩊D?F5ǐLHp&Zh0oWYXFJC,c,I V`lKVY)irԵy'uhveVB{>fܭ nyEz!(묥8 7&h7A9/xR4֙$0V(Q 52JAl`u4ĚݷI{ST)Ƅ((d٬+y+pUg׎V~S`Ʈh-,P/i7~M;+lڦݤe/i7)AV&C&֦mK%w̱,^ߝKox`ҺA[rвx=K %%BQI-$I2MTB8IIBK@uӤOd K4j/dNюJTx)Z2QǸ,,<҂F0a{1q6.=R~⺐Íו{nFVwV7 dCwĴ JM0MHj>_D]p 7B'>Dmj9nl8H<sR6Jfx@qqʹBNC0ё6\8scc I2&L$x sN_IxjZclB̬֔E<3]U;8"h)PLJ0d-^O I90hy8)_4"scsQ,ىOD*'vRiIL9,gp-PFlG֬mf-bPO>͗F_"ߨULTR&M8Pe)JE hAK5B $a^(d(o^QT=ML-W@-؅SP+mk߼}Å͟}ڦӾ6EZ{uӃssaѺ#,JrdcмiK"-ZDbr!jS(4"!M\27$ўz]tHT=*Osxzؗ/=?앭OE6"wMg<)P4#qFHa-<`\Z wKߜKE!B4>$`-0|+@#M4ΕE2Cpڧ$$q@PXH,˵ ֑ֈ#mk`5wjqx5L|!~as[+XL叫U`>6}H]9Ybjxllv~]۲BGf$sP 97oȯr]C3?w~"İֵ ϲ19`oy] Ž%[1T5ZM+B| _Mڻ \l`i%W^~GK$˔%%'@["<3 3M}O lB$>c1!ZNY&Rܩ/ u6f?﷝, Z2&D+<@[SJEL 2ʪ܌fUTzn8cw!rBp?_73Ļ7!ebtC^ڳq沷ƝKdɳ#V43pD?<ț7O~ʹBΘe_2%F|J}'cQ٩NG5 RU uUL%߃\ b؍W޺ǙsނV/UH:!oo;Rz&)*8|E$~8q}sy7)8 hgZea[R66 ;|M\U~Hn^4vJh> mX+"*ZD)k*  `aАNE0^ y6cLP-6Rg-׺6DB-ME Θ TJd"S=H?s?0 zq|Q5YJ ˃0e(/ :"b7KHف'r™*1Tl5mY]lUzvF%SZ9loEo%0:z.|B9p-_jMQ"[HM=fAys gVـH˶H[dz dꒊ7mp]Ⰲ^H)o$Υu" u^bk9ᵝj9O7 I1r\DYC=)Ne}V19a7g}N|{N@79pSU6Lv_۫9[E*/X|}(>w~c"=7F3n@ܢZy,*KWBx 2Ѱ(lH\3h%"u %ضfZs$KqdPD*$Pb)U1H*TMJkbl֌lbg ya]Z]G5l7\K&.}Q ab.jm(ڭ%b|L1!@SͦDL>9O#s$HR t2*zDhR@*5di/pDX#Fsև٬kP?ˊhbF5h5bq%.De~):Ye)¹;|ART#MIʜ+$AKYϸ༆( {D ϱ5b1r6kĻڡ^}Ua([ŵǵh.-;:P΂"8jA2i^q![&[x,wua}wӇ{PaXOs{FoIpIEE\xG xJрBLm?V LPRJ?6pք*:+@u[m*dA O $xHHD p3mVMx7y s+ * &N &4 &s){,h5T"=&b}e)Ζ9SޯltO~T\B΢K}er%^)\>(ޣ'%_o@OtD/ܔۤXhD I$0*Z`p;:\W,h)ÍNs^i OA*!Y2c{"ΩD{í !Eq矓Eg6솮];y r-iǙfChsQ\>eo}YQܺ.Xv*Ƶ]v>q̳Z`%l*l, O+g<=[:z֜Mt}roS<>csBs:[c-_>.q?BP?VI 5teВwX#+cC֦jagny:u {+&^~V /5 /lKyou0KvB5yʓɓ5k t䮭aCUh_t2-Ot.=\ztѹj\J`iWl]>;9[zw|ѷh8HKU6a!N6p/3P$'GA?C&n}Πm=.pB !̌ﳌ 8X¥&\*+y2 ,:{@c4wo=R^zljwn:mtV G3~m:y2{R_yh-=E'&kj9E%vvC.rfGxzg1*3-Iq9=u6 VBe{/>M@1'7tE_^d.q&Ra: 
lc;xW{?΍tpe6D~{>9_UyIVpYOM LO 㟷W\ËSP:KsLg7#Z>yS􃷋cb4BY1'~S{y5Y[nvQv\{^|q{0g8Gn5,2{ZG U;߻-auwnrGedI64WCQG8<@`s׷#_=vGz ;EջSɇN&n6N}F1uG?꯫Tq)v>jцh ˧qۧ+c-FZ&^/~fT5H$Ӭ g'3c/E CY[uHĈ@+þito)}٠N1Q N3h4$N%GE&0*O#49Ѵ#3\բcycL"eI "qMGR"*F!TCQKb8ٕS׾ػ_}bu/k&o)K\ʲj])O"u6N2i±RG,KQ**mBK5B$a^(d?x v=lwS;{/EB[SäWXJ ߱y̞Xy'vՒ\w~%2xwLW\P0Z("-^|A'Wx|4]V#~WϠ%io-"1`!)yx S&fH m胿]OJDnTy>>n=b[suY/Wfl_)a)Z8ւViIDF _?A_Uj1f%C&Cz)*.f_ad!TnOz9!8s#$lV(?DBi&u wK`a[x)G,>V ٲڽ<7%)el\Gsmni2\zpagMu8%Pɧ5)"RxMfЖ yXR.V)qTGLg:vd~lΩnGOSiL\؄V-hKgonOM7zt'$?h^mgmދ 9a۶qqt=Țsh\hq M]XRو_uIc9 #9[?lSlWSArI8(Jg!`QWrqèGo)+e&'J#>+M{>5r٠Rev)D2WLH DBAy5ۛ=ѯJlK^]zs}Sf~# /߿ΫUe"K>p\ԖIѲ:g|b΂'~gM>}iէZ2U<5梭#{is-wӵܮW߼FQY_y{]o߽Qqn{<0?s+ʱ{N:fcGx?ߟ3-03Vq[l ,@U6(- -f^2xǒ~f{۴C|.qRp$. 𺠄o$>-+U:Vāy>X^-'g$ &!pcK Gؔxʆ  빱1$ˋ&{o%H@xF$S7>\,/&ٰWJ#jVA ͱ2#N =F 3=9!gzrM"kk7ۖ|O(*U=?aD(L"2yW526*<PlP^'iwzRރ"zWttR*߇9,M+ '۞iRPz]q&IZ*Ɯ mg7Z$n0=1 4&e{((QM*GؖXE*qp<9TC $ȼNs4iPzsX:v|.Ej*PsfuUpaKKW#/%ūc >% 6JW(PZ%LJu019:Dh@g)豅$ngnA_EF8.̉\|]_LvunP]mܳk/A0>dz~>1wwMRl(aMM)6ٛepB@cVY!LcOKǾ/@%t$F 1(dfAerMTCQ#Fn߉VX6Deͺ:{܌\B:f=w#>GlYy)/~}gjA>MlKZJD@>1ӊBG!tbSMԍ頓u^H$#DlGJ"S&Y87QgLN jN'/<˱L}Y$ Wd[l2 Y'mDv! 19d m2x8Ewx|a|h0.ذbW^සs"ƀGؿ|9]Utтd^A.]GT:YP ڻ$x$HBsĊV$c>CF8Z4rE|(m+* ~R:'1M\lyՖ.)FA(hrS[#^.GUfq{\9e2LuzJFLI)+.;2!(lJd$ xdz Y =;)ѐD'HƓH2@Q1RH?m-B1^RN:eh@&Sẓj7OIgq}bQ_JX¾ |P"81 Ts"a1v0 Ujf?ZJuUgwwp{' ^M,;qb!؏}(PD!E JoOHQ,dU% W+OP-WOjєʈ2ocobWe:rUYw7qLtv]bޏڦ5"_㕰Bn?mԴ3}̧8kqx]C:c%W] N׵ ? 
ȥeioX:d,}YJ5J⍅Wc;>^pؤ?%|ǝԠܸ.[Wڏ وfͦ={+S"PSMWD2X1eA\x \TTL'&s5댲(Go\:Zh>qaq#*^׻wa{*2O6wyd<ߞ7#rM&1ƵA$qzJ?Γ00000000000000000000000Ê+OH2"\'>&SmVJE|1~/]%)"yPiqym`_Suj[}}*dV.9'|;?8,N]w{@c(z=~}4HGXLNGYkwx6%5WüͅU!1Sq.;C{-2𥉉vZ̥\[dGw猿 ̓y76|pQb6enR )m`Xç0BFVbN.?Oonv>v7}FNRhuͻ7ňΝcaĿ!5WMT1\Ĥ<5Z! U僶p:jX1G3gl>_!{l|}x'+ҵEh@O7Y~$ٮ0$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]$E]lգ*Ԥx5Y;j;\>P ]oCDw>:|8<⻲.XI+1h|  5Zf+i IkXuNI-n4딩/&G &KN 5n< C(^ܩBܝٿ ;xY_ݙ}Vޙ8oLuEb)D0Yek(" RH `ߘfM{C;pE;a'BO~M+!pƋR=DKO7_=4}Jn $ၦ@`KFh6iP( eY@4*S"aVup8AuK'Hi&K& =67sZbP&3tҖZ(0rBwFӎs_& |ӓƄtLWH"@DDk_RW)\qZ)XX&4)U%wC)#b ڙ5|_ GپdAow6{GC[-qD3K!`{%\qY&%>bwp9 ΂M0gkdH|Xab{LˁexQ-Hu8#n!7pe)鮽3d4FFc֟w}bV+1P|"(yh,ӶQ&H͙0);H|Iir:1{]K31Ylrv2hc9SxDF_>Ha} yVeX52]VdXպsa&.f~Ot@SqAP{bهiKs??RW"-j5Z&kcrfmdJʗxJ3!g2y7Ei7nS; `PI>uBǀPۧcE0iR6?WJП/e -. ,bG;ħ?5*G25J xxcRP1ˢw{I%/5=ZxC=!Ȅ`Z=6y-3JH,Pm.DYubR'!S1~Iqڞwmnܖx1}q+*;7V;T>,r!$R{Gㆆ{ ZcdIpD(cy3 rxȑIi1R41%4+VV ꬼHVӼȉY9 ) 7L:ZO?,ǻSu2-o:p3sfGhu,_-".r=1 SBhs.{"&sb+. BbZ Sxf]z=dL3 :,>,!Z'ia䩂<8)vp0WF~֣A YrZ$w^G4*x1'0{8ޫWƒ`a80Fhҷ1蘳ġc"RT[ٍ҉A'+ ^&MЁfY⭾u7$ѡq8C)e%*na9z#)*iHLfe 0~]== 'l08QP b8q5k]P$?~\QPܞ6xg_t 8컆`_t:;?OX>ivuW/~~2iMhkuϚMΰXoo6ccn7oO*y3w2;e[\RYRN/zj'Ǐ녣\э`ǝ܎WwM7Ong;t&2[|7soEX1nv{v@޹.W?ӓh2)ISIM]3C Vݮӭsƿwy?G?t2-g#qmwolٲ6v^Oǫu`D.܅܅=HΝNMG||t; ՛/ܥ87aG dԊRZ0p˽֘HuT^Y4Q U꘣Ӑ0 :e< =Va;~"aAjm0Z/#Iž 09dR9]&UV28^+#0m̫y_Vge!}j,/4 {U_q'B̞4R,КC:|i'^/`#LjdRƀ !ˍ)ȈMS"2ih]hQV׋4EEI'UqPĈI(l";~T%&g@s :q2D=3%&1x' 2֝BvlwAf;la1jw6h{<ֶ*vhwCZB <"s4izR*9zJ+Lۜhw!jܪȨ0f"`\X*."*or\K9(RZw#c=R sPVB1`=i| Vw9%ܖq5Yv? 
Lf'W2b^:fQPRJΕģ¢&:%UN;QwEZC{> &CRؔDV c֡P.+1fdڵAju8b4H̾vq.jʨIe!8 !i;ET#2uy&2ǒ4*S1X21CHES aM,j#a}CBH,&+aVo, 0s*#" 8 Y@.ɁD`-';s6\`0LJA:N 'P1mic" ;Ԇd3!hUYsC4FRB WQX$pP.ӫ˭xlFɹh*pqŃ۵Y= R{τȆrDО(@F 0' '􀋗žaq.xh @ؑuWF~_ڍuv< n~|%GfeUfTMOscKFkbc켂#OLRF6MCcQRѹ$G8B;Bի=WQ> %T\K9mLAP,.ڥVkq*$l*$둿 \ˊ<^j?~z=yLSv;Mf#  xQ, i1)U{|Y~Y$7SS=(úeЮ]ԏ f(څ_7OFj7?lMqdkLШ}F@ݲrN쳨;u#苨;/Z獩y9 *"3H$JA3)S;fxHr̺,Š3/R`jo%6gF۟S!6]I#i㭱:Ldё0 B4PRz޺Ru&`RTgmt(9_>{`]{]#4tuѵ})j ؠAbp KߖSmQAM l$nayNnE7/[b$x2lK4C'Fge!#OյgrZ-Ď̸(C6JЧK$STNhaOY`CsB'ԧL r2MUKxg>vWnyM"}| ٰ+8WbGk).ƎhX&yK6嫍7]c! \s , 0ɜ5kP@d;nǤ?i}>zZf,Gǫ]6onam/LT-z5(Zp6G 1}d&whK*eHTQLV95H笑:)ڗ{\3N}v <~Zy4U >hE5# )&7_Cn~.靈OoLG叺n:=qT{MTaS?KRG7:!cN[#ov +@^Ig]6h[6~B%/5] ^EiY(B| CtdA׽שkYԵ_j/ F!b)g98Y$!HRԜxDPHdK up!%(ω!@(tT(X;kmDцIKZ;ǭ_OQO֯37;oo* ^W dڿ畨Z LI0_K8hUnOQRpy8 4;л@` QzꧫOzǼ oxVxͅ|knM5MWwi}t=J1md*0ԙf6uJe;x2n !aɅw(gʸ<]e@^1i_ %3{>[ӝaCQʼn=C֓h 4K^V^>tFm6ElJ&ke-e` 0BP5;LjnDh:vF&x&9qv>r:Sz xc*@ۚ0M0tCvZͤO E`fx;m #N.L'dd9N#xxl-'@0ER^ȊK mˊ~LimJu>=P.NnѻG%|(UV6UGՓ(#eLf -f<(afl^qIuCvYjRY܂`5mȐ,1$Kz,c\!y:0EZG=PosVViٮl+Kz7ɠbM-`^]9L0'Q?ýraҲe1cg-pk 9Zr;AfE2/O* ku|oȦmƬbC%CbvԘu L3Oe*r\BV jeUfNoʚ)b-6M5[>&pβ)gf+),.~-̝ORB%7$JvX!jD+rCP XU Gm!lv9Oq'f%+yWhRgnOmUٴla}w(@x$31DI-VGId…00B!5wP1CD2!gc  3` G 볬l嬺%}qÄN8#R#Jqh]ս*[$Pأ}*\9a>P̏B19 ~NɱۑBkBIXPDDFFGĤtATZ*^k6CQ{0rHRFj i EB~әă< kao`@)&cFn9< [68|wj6ge6K׸?L?rA QTz! ߧ$܇އD)SSR[pX̭5įo7E~qw֔=~Y)j}' B2R?<`.+n&%$%Sfs0wSB6xј1{,|;Uy:V9.c_1Xf/<E;5E&dTaIإ .hTzbH+cG$cc0ZYC־d37| VJ⸦͟*2v\Hs`WT' I0ys+Nn(V;i|:vu0::ڃe`Rj{"`E. 
Q"LP*tNDJn] 6fڴEcmE &$r EI5ÖYf$Sik" $PAqJǨQV)4FƐ:^T!$UHQ+Nz ݰ4FζҺ&SX` 26żB[BD,)/N(8ˑxc&zO +M[M\ wc3̱`Lk.9<2Jl )h&x^JХJ < 66;Q[zT(~vB8m[is/KqǜrAvֿMj&'Ɇ J!L"uʛC[s814c*-לIʄ1b 3:" Ƈ"1V2, xx;E90mGs+:2,"#as'KF0k4t@CvFW¨+y{e_ƶ3Y& >ZBS >XO??&glS`hn)1H)h[O9|v`ȚGX2 BZS&M *P"4%0LغEXXH>%h`hp8Dd%qM#Y@'_.]?Ǡj7,]_)iD6~nVVZaqHBDNȐx0gI9wZؔ 86"zBj4Ӏl?d͹S 26~b[o[e$7#מ$Qk|m5|& ;:'5͛(` O341 %1v!<LLXJf/0 u9rS:&Lvʵ|{v5-G+Z#;_yi}aS|?=$"F"T3ͅ a 4 T-&% C^ !Ghŀ8"J[bh]XDQfW"xd[GZ#5|g7|Z/)s3tRw@T%ӼzOV8g2e%B^ [-S_4 DX8fR#%XxQ<ʷgi5y{&p[T&)B+@EA,XZӈIpݸ~scd辰8yXF(F>*)F(KO`1(䉒Y%%qMj9?#RXB%2)h8$0@QӖ]Ϯ'߂ȆRJ< pi /c|j|:@^soy1xØcqNWs]rJDH61WP(),;w([y4dߠu*%fb=v{8_qэ&}].Уꉲ˃ WfK0X# |"|m]Upy#fJwL v~J.zAe~Tί~ǒ~`~\N([\^z]/>p=B]7>ETQƒ>^V,3/m(CbxH4{M~r{yq4'Ș2#77צG.En({4, lf%uPtŪƆl 5TA < O, zPKcr/n*.8aj2 b}b6͠7)LS)*]|U\CAχm7ҚFm%jm֪Ir^n{:Ԓ ZgY5AV=&QLDe<훻/e[N9c9фw~=jíxR#1 *-V=%  t,`5Z G:qFGvyn:zj+P<ǝ%T2a剑 J.zD\hiDZ GFskD]e^XRT6;|ƵąGN8#CL$6~hchO2h +Uμr 6]YoI+j 1☤y-z$ln$QT) LmWb?).J s_/r]hke͍h[pFIՠm qV XA)\ɵוHh#FRfpy|d]1j$\Ѫx& ZHOJL8) ٚ.Lmס0O=yWh'1 vm :)(]1E&P:Rv4 S1{d8SwgDCe6xVStOY~aFq$kH+.kPؓK^otGH|ꌹm`LW0xE xCNK54&nd2fT"zY[kVLx>"iM:(lB2;Yq F$dEGrU uN L*D#fPCN)}ŒM]]͒gV-g8ȾY\{hopk#43'4$odO ŏhBG =sX# @ngg]VW|:եx]]O:o"^}Åǣ cT0;9P|}5$U16+-RԕΟnGdz[,^?y`?Z׼ji9ZO]-n n6/_qNw Ԉ>\['[їl-*Ol2(ͫۏ?cm :3šΑHYӪq>q> ObdG$]4 Tt፼4l 6< 9SvܥoiUim܎>MsK:u&Ȃ3)irǘOL1C&PHk4w䭔[/ɮSهFXc3}0cV3tFdt*$S?}14g3o{gɇdz3[.>MV#LNmǤZqƄ GYwP^XBUPmdD:uFx۪m.ЬļoLD&\\///PI)Qb*{iiڐN|,'++SWM?xlژZ1PŘXg10I$D}**F?Y'ti] 8WzsZM82i%eqܳp_i 8@ׁ> rv_"qx`~]Ɍ$9 F?*Tg TU*~NdË%|Y)y,큡$lMR336B,dH1X+%j-ǔqM# D兏r(UTE䌮H] %M*Nc`L3wsx`&zz\P#_ LI;tIP,"f'cJSDЊV"Ye (ULĩLEs§`9`u̯1,e'avY7ӓ^r;r3JM6n&:y/o%T{K mWH\<( (OȄ\Wl$i9cLVX3ZJ%FQR7.l}})e+RTkLVi iƱP7j'?y{WBq /U~/ta<~_os %$3ʠfnk;,W\t$w>1URZ|* Y ҪMm]-d0jZx, mITDrjD+q{8 y6A[iDZ^{m`wmdFA&ᩰ#0T $JL6YWJl\[Xu!Eo$0Cfd(Y4 nHd87BdT3j];8aOW)cx[va( ,xFYn>\_r\NپƯϼz_?N^31Y)SP[~$BH41LE$}ɠظtPˆ ȒeBdЃS/Æ5AD4@9iUȐZosk&~#^ݼI.Qw`:60eޖrОmO[O>u6V1a{}x]_ŀ;8ߴ(<嘂Pf16aXFif;ݢwvԈaGҞTK @IF#ɢJTFB(.1}oߢwZbyPWa/JY( &+!*NE.- /w{"B~^,uw_ak IZqqR0?{Ƒl / d~T &E.6-%)z0Ȑ-kd 
g8stFFYfqm,y BINΧRIUb n I'9~{R(gV' 0qPKW ,̾Xx^k??]H\p)ub5I%%Qj%okcZ=^v //g^۝Deժ3gpG÷ܮm;ϯNx54~mhI [:Ԍ lnf]Xސ"4>omTZ ʐLD;Q|mrwlf1B(ԣPP"պ]Y_̗yӹړկ>}qwsyͼg,5ՙek[}70PB,&۠RcSTDBV(8\^3PRuB0m,kaɴьŐ)٠Otf @]dNH\CNZRE2,<i^;%XZZwvȎ2]{F|1c, '2daC~al  f~BbzZ RhZ{& {«\uqJx.w <}gcoxؾm>A3aiel4^oL |F2X V:X#U`^mw~k3.#8G%T;s.+.RhL9qJ"0BӾ&!fa@ ~<Ey@4ojXw}|]~x}|yW†t s}|L!"MwM ]o>G/ٔ/z<&M|nmbtv8nJt8d嚷Q#WDx8&,"d/e7hd\ukg{= All`, r g\(UH"j^*#zN2*K߭)DG|IT08 Y( yZY`00E僭j~u ukόt)I;>}rE1&G@V4tE>}$c=X9Eia端wjT*"7.D~ sSvA dC{y-\mvY]_lO@^Mq~ohhta4btR'+x@ JJVѹD'rBl &1kK/wwnaETE-}Y%8rf\JyE`29"äT@ *N Ķ1m1]ֆh@bgB h!餹!5E$&-Ho8`];-D:]6 ^g5).vQvZfwlRDT qɐGG  )$! ioŮjұ=> +O"C)j:Kgu7I?To2& mfS"!ww.{8Ppލ(ْEd9-Ɛ0 u[[C={/ ⴆL8m6buw =>DxX@l98%jd)@ć;W(9fBA" yU%#Y' }RHRhH&p)X;̭Zw+ u-j.PE;0en~ӖByIWԑo!9KK\w-=w}$QSR.R&'eøgL&ˉR] {8(ب[ ΄ٗޙO6VfAe.#8W.e b7L)]l$qXuy5'ڗ9hL.$cN#8N-6ڎ<պC΀ML?s\ˤeKGD#O^*MM[csSChӗo8In(ah ,kwph@"YhP\rK :3gZ4j4׳cZI:DZVM aqiwVd<A cU=,{i1ZSJ)y+6Ԉ"+ fZ$wUΎz]R3`y)ϡK;6zed[Ft!*VgjfA'CH^)0;G'W3'I/E 4KɼZHL 1zͼZՠF=OyJyTA(BGtЀZ'Q/'-|K&h-I kco8i]^t&4LJ6jyAxbh-D*&JJ)  `> ic8o4CX&|,?T?f6e\i|Ni0DRZ[T-[G-x.E}6IJ`Qwyu0DAߙ]Nӹ?~a0d$k 2ր׵JLrl=̪dLޓO 3]yoG*g:q Y`AS⚇C仿)EQ#ǀm3Uuuy-x%<@rFϵ,wJÃ&G-D]],IQN)93Z&zGEJ M9WTW_Nb-*0?J!۠LșARL ,~*٪ǩMHWqm-l> )Nqh" B8l] Y@SJxxl5"~kNm#*텟Tm]=c&|Pxcf%am?$콛cke(=quXח `\c`; R6TcŸ/ e?xEKȐ Hyp0<"ؙXpr)ƑDK8*Jd 0SVfRv)DӔʊ#00Q#HN`,2K0uHH~VzRSw˖2V~m /SlZ N{ G lJU}u6;}Oח#jr1M(':ZI%`BÛ:EqF0cpUM IQO|> oS,v|nBr;'pe&*%[a&Hg@:Ld4#; HB( Fk:idpOVK8 t*d=l XIܿ>i]QxIMrzz.ŀm5adˆNӸ ShP^dXgNn]H$Ǡr'!Wrꌶ%S](>y|&;6D(3.H( P,@p)62ıG|"R\w*u:P'X)(tS=( 9A lP[ΞI& [ZҌ#| ,!u Q=һ_ۘ&bvغ `J bZz q:$J8BQl"L 96טt`ultt-_Yw HSb@/ I5sFXGGg1,`;"9 T*F1p) 0HFǤ"0SEPXDI$^ < *C%L3;.a=x]VTÆD4,rj<R9;m!UD![?T[\[NH >x@HT"D$b71RdRFceZmYAv7J]V*$e"0PY Fs5̰wT1 2yiH Ygv@$9ϦKK0ɆEҽ(&+ߏ5Cz) ڸ{gt0{R\Ty!&(b< |qu>㔑uc 9eI\\Υb:f<}O@ ;%P0 8+:mqp Hv& `a`Bct( &E D,Ѐrs&r!|ʮMHYjr^ӅW(q,fdwNoA7 Ly)WJy2|=M|]trU}z%F)ͯAu/ʹz1K`I) B5=Qj{b|sOnH}7ef9Aa0,tTP]zf=^w'Tmmu6Ⱥ^ƪ1(y3Is`7Qt: > KW=ҨxW63׽vO?]|Ix&?^_} v? 
LK7H`$}?>kT߬k]s䨫 ߥ_k+>jiK, ڮ﭅Qz8!|i$8H-]<  |g_Ae8sUUo֩/v> Q! k6ViM@^ڊ٥THy(Y^nN&!iܛ~udLmwƟnvJuETb AH,0 Fsm &q<.ZmDx+x<Ѣ'~{2xz+(v mm7nܾY-9bQM9DTQ3FXnùE 93_9TL~(_d5͛(` O341 R3l/Bb&& i#o\ؔ *L>z]'^' j[/Ww%"F"T3ͅ a 4 T Fc CIxy|(*/E1#R5V.Z* #=EE`1RlHkđ6޶`;OKz3$vmnf*\ E{_q:d%]H۫!xk (2pV`_&1)Zӌ&WU=<扆%b\U§K hc@\)G,r gĒjF4O_݌k#wɁ5NjHebBbmG 80@(UR"הVꛓ#v}DB_*]G!z(jڲ ?ȆRJ< p&A>N5&O-XFBX 0 6:3fk͝$bhp#؏`P ު}CE }Qjq.X(쳊a$ pa. Ȱ9;1;8[Ny8~T|>A<{>jlv0_kͪ:r~қ_y`[Nx⥤3O?\V,/2@ ׌D#iW ..gќ|L d-rٝ\Mm mz5v9\¤"I(Z8l5xAm (lndjᩥ5[bip %ӝWazȽh&΃&6_0:ӻx;зӾ~~e4(~,/9w/F4%`MKOvx&o}Ҍ!" " e6g?_Q7f?0 o+.ӮϾ*=T\Ksm.:85C/ٻ;;,9k0Q1ߤO8#H\w⊕DȆ?KQ+PeU6R q'E15qM/y/M)+gHjvu[]ZyϮZ-n1w %[L<߾]ڜ*"&z\dw7׋$5Mrvgڶi[{\Y/Jgun}8Keg·ಆNZ,TWm>u^X z]ow.&N_e J+I|HFLn5I\yL4h&{hLecXb& dCB`+Ff<߹A8E90mGs+:2,"#aw'3H!]Lݠf|mAFRS>Z/G2c/&um|Kx@Q|尦W%K@U2*[Y;޾zj!GHRSRjI`!\,Ez@_˰[~|jXUEǙ)2LӸ/Z[Ob*z^.T ta%HxsۡPyoiTOP?'~h.`CW,X9e߼&]dP+R N6>橌ޥӳsI[z4FEZw!M߅5*Ur[4as &:Oc.*5X@Ux9JTnq..|Z| /lF?|vs%Dv`ć29Ȳ_K.gA}d662܌[IMG*lM=h4_X׊wP[ ߰VuA6at#W g75g]Ĕe> PΔqB2 ט`ëĔ1>:T^`nJUe8t6v{\8v/S #z!Uxd!R&R/5eD$qA #(H8H?ݽA3tj,Ţk]ڌE[O9Nq"#?궢rz쵲2aDB!X&HSE57V"FMsFV;|zy&y|t==&0޾7Q5f]qa(\c0_9egTLkq(5:R6Tc޻,}ix,N G'8G" [wH"$px( Wؚ2֠R</o;cƌL"^ˈihnA J):$4b9,U,Evc!f`ꐐS ae2hC >Pɢjp•/fрXY *?|*]6RBG:ʒ{[T5s&SyQ4rm՞OwY~vCُIgGMgct|v; <̔g*[d֮ҳȭd!yU%[B()kϮW$z'Vъ ny*I q1yly::ZwH?]|@Cݮv>qM+-l2_su7y~. 
y#wzב;#3}4nո00Y+s˭:TF=773'z?}tRMyr FbJ !D!MGm|fqȨ|'a>Ά_k!>v|gpZ:'Jh&\mv RD[(M5+!M,L1:ZprD`ӦYtMٕ8-x#8Ķ k/־``fηgL%O{T 尘nbqs;$ *w؜r<ϩ3.XUw!Bw!ɹ\H~.)7EAVwI ql=Q:߈ǝ$j)tr Нif & ނ& 3Bs4v @;gLռ|Ji Wg=UmvѧĮYҖQ[|al0m\bzLx yG R $+ɷ:3N۠Lz xE>TX(4i&%)zU>-.\qXtO{0-`8k0H8u#rH#FH3m1u9`eK%ꞰE6 OE>HkJ LPN&jgu lqy2Ô*#zƞiA}贶<\ ZfΣh`Q E=I尥1`@=B`u&KX{89as:R$D@TD2'T8:(͸J&fCd '!c⇜06>B zXPQ1(ODOurDZ "(& ߫Ozj"0薊+Q=\MX֣l!=ҐVp+7YD372ŃD`0Y:gJx(4Zsk$bsJxhl<2 .?bQU.#_$8MHeg2bIױ'(C15.H"XaRƐʀi)޻kknX[dA-=8nDc^ )( h6)`L| hi{!G/c 9SDzw,)0Ë^1%8ꅡΫ{(3+:mqp Hr& `a,0 wTn*, ٢8\λ Py.SS*j`d*z&WlեrƱs[.k Li|ick ,!azRa0LUll:ǀ(  NO`:}|߿!ѯo?9zÇ#L>:{0&~0 ׺wZ5WwMۢkN|j闷 ^$-R(=~jRumr9m^V~zHf3_1-6]E*%W?^ Zȗ*Dh)ubch#zdNbWn'IBrqg0P l2Z={Jy*:h9IOpU^^J_b{X+0f﹯cMtZFؐ(S pv33' Ӟw:a6?}m̞pW_Cks⒯mi١לb޽Awwgǣ ʍQW:EAnU](Bj1}Н7h)Glj<qs^M[M\ rwQTk0ǂ1#RqTzg; KLs02d:_kd/;&L{ðΏE" [pk'&7$QtY;5&)*:ƈU,j z ,(2[vE iQ1x@nιQjEZ)$ ZcvP:_HNքQo(TS‰C٥t|0W p.b>ĤxXR7^Ϧa3MY-T΂#"[w!0ʽP?#$,+MNǔES+1#B4 i&Dl]釭",,Yydc1E4^JJd`g"XZYظVk܎@'N'/XVSP[&ʞKxTW |^.|話[A>ş.=氊GTRi?Xf i4.cCVBo/(<כܲ) [ ]BJm}LkRTbYr#,7\zA"왯ԙwΙ Lc91hߴE3%f8x1I rhb}YYI(`bŠOV:ܕs.Aer?\p`oFYσP\9/M6"OXN`+)b!B5\XPN`Lh,a;+ YD%}z}_iaޘ9CR&Z3Sbdt}\%^#Dx0ד0 GJv|ೃ(R(s-1Y UdY :1c$) D6(}?W&$%ժmIUCr(63JLə6/y Ec"8 ɇ٤WBM|@8 w_Dݛ$i ɵ~ 82o%ںdu:ru' (aK=5HPtL-Oꝥ( zQ7Pm-䗮FH0YfO쾊 KD~کDmbЋj~=6h Jq͆gHD.1ז1>3Rh:^3]FQ^ߎ+ر#bZxk!x+P<%T2޻'F+huDqiA uuJfNUimn+AClOC-U_p+R?;kLgЋyEpzK,RYr(əv:1#L8zBd Q;WQO[5hə:ASt'4 F8xj6Z)1X\VCQ{y!IOHMLc1 g jkcar`G@+&RcBS '{+&6F ˪E6l'Y208 iQr흟|SYffƓdEJdQ.׿vY2?ɊaNaEl7{Wh}[~Ke{gpY]6t2BYlSbvdNk] X ߷+pqj\8d}rG$- {t%=Xg|}< խ.2|KM脢"FwÉyr%qf֌]HK:DO1·Y)q E77߽ ].;=c^n;If)jVC_4huxLzE`'(S;=SadbΨ:՟kAc9Ѡ qFSꍌ+$X2/` \] `\KunbRD`r[WP\i:xs'^ iX cb^dsB/̊:i$Fs"AY9`$ \X.\"qaf>N|}ǬwU˕+%ɵ$Dm%9[Iwtg`Oc3Ln}R4H rv'yDrz QM@V@RaS[R>-bL6:kb:t1[y;R3Ke`Z 7""0YB j/LY0 fcxi|m…VM>$h=MR k?a,%V&LvmH%1Qc"X{-.-W]uݔ1 :72@Nk͹E $S1T$RJ4`*=K'cO)c;^uE9_f- n@[iny68OZ߰wPF\ji`q[f&<8sɵ&BhB`inۭݺ{o>־vJm=q*XJ鐌DE Hf2eЌc %1rhԞ:ŁC[Qkv>ӂ=GOPJs>6 L<; I2F N[lmNZ; lw5 ݵuwF3]`h7 9WT(QC)3(3Rer+gI1Jt%7&^{L9U!PJ{ĵJ`*$6D ,pL!€1T ,逰 ojRz\Tk Xe[FJE+ 
酔L)O}*OX`=$h3wuтOM뷖n];tLa壡#X&0ਔťt8?s-ǸΏgqJ128޵q$20Ȇ3d?@3p13lՒᒒu}}z,IJZ $wwfgj*-P H]Ʊg9$;4@(wK3~`Cu}O%F&먤|P LE@ED@1ZB݅zJ, 9-3}3RrV&-y6+POy>\yfPx  ߖAt|͸ȶ_gӗ-1F߷#ŷ9U=XꐻC6#tjHv.}Eَ+y`۰pڒUnxa`PA=N.-4qfsGAp V#$ǜUٓʀ\!>V >v >ZUsFju4vX-ؒ+ ju]lqQ۷՞uNk]"uc+T(Ycj+?&l]9GBq~#ૂX{_^?QF}>j}//grxgLei}l*6D1Gr!ӈUy-ձh0W>ZR+T(}I*ݑP#)0S|xxN XNrEO63٧9&E)2 BP$U6Uj[N5' ]Ah s U>AE9!3:˿QÓMH&>1+D6` BJA`Jڪ\2A\5ӷp1 (4H@EbWNW!M Y$uřʮT ȹ]PmƩy<-xg%|<NOUw'*F6z"EJ,% )pۤVzknܮang mg]h{m\\6d zūr=Ʃdqk\\h=Gl[ւhLȁUl ſ ɩo=zmbXԦ$iY "oDUNU)"qn9ɩ_iq_4b7U#4A#I媋ѤU@fh5sЦ栬u{7YAqZ@1j|XmKS%Su1%O@)hY#v#v85zOYbisbݸdW:EЋxrmT$p'mMqΕQ _P Q2>  X\hA/އ^}؍;vՇ>{Pa^Ğlb3nLw~4ُ^Vݳ_ xv+9*S<1Οų( nGyfqڱC"$B=Oj!ޟDȝhYZ1 Pj z(D>E*:dQe?nq߈áPe=(s5c:^Zپٯ׍;OLŶp*]Ih$;rG)9"cc-` r}<}7kVa.M:aֹp_;m[_g-_qÄ ۥ~wD BA I1*G(Ԁw;Cx臰̂YsTX}#819*} 1JA94fOU Co" r+CL'#[˘9fT >ql!Mez't#tDSХؗ*o#z`Q[9LjF:VvrQ:W2z"Yʾ*R"Jvu˓Ӂ}.TlXM cOjR`DҀɆBو8vjXQ5fV4C@e=.52}fr֍;$W'1o,>QaM[cEq)]8[D"ET.E][QB 6wՋ/Wz?Z\Cנ.8[,2\i j'CJ('r9ŴFbsm[ً8(jv׍'>fw|;fQv+%E1hg)z\J,Պe`-LR|evm<`轩эEv>,|wAҶ$*Ɗ Z#(Wr L R7E{V cW7U:\gp_cr +$r82|d' RQ d&&ȀLabqCe<}t e2 "ǔj%1ck0="z7wWvX#<ɬ=U?4/ RvDUĶ ?r'0霷6)'\Fٸ;- /fyɟK./ zᭇ8RxL*-Oä!SaLޭRVem:]܄m}޵~Z~4e< pWG{n<6b`b ayyZ$Nl<> 㕰̆Գ·Y׎<خ,iyG($h aw_!`U6Џ2S-d"߁7,:xrh؊ k-cQ8 v(yٯGК}qܑ"DZؘUY6ɧz0N$bobSVr}HU,IlRoMOm5jaߊb7})%.Fw|t#玽Ao,!㭣x'dZL\єlSjeg R0\JZ\ QkKqk9-#~)f䌛QF|ڌ_q>?}X .l/SǏ Ͽn٠9?hG?\oq߭AwatvBlW[%r6<9y;?W= S%:j1VSz&x_MWYopjjz^ iN@jNsyz9&RQso>_4/{,b)@8uAX]+wvs|y2+Kdg0z^8Q[cJ)}t*ʸX<{uDg8 EsΠ$ZtӉG^^ iF)/׍:㩰GY9-s_ B{Kohnu)6ᝤdWێ#KkPDj6^HPs!>ˋu@|)OFH]VAy+[('&-ßϿj4yO'OiA\ u[ "c室{t/N.ĕy./p/mɡ}S׃Ǘ{-P{L٘/3 h('V @tv1 ņPC>rD;/LEBЉY 6G$ hg+@Pׂ.֍;}x;Sr%%S.eL(qee1iy$p)Uo:o7ơҺcTzb3H(!*EECM=Cnrgclq1e d/Nd2&Dt. 
^e j b @yچb9عEwF~YY ܞ۱ mtk 7e{ӂb2p~@ TndΕ1֖Mz(Fҍ{]۩=-DWPR63'j#xk T;U9 xmt mlUɉ%Ďr:|s ["D9Bkpߚ.Ǩ CJl))ȍ#&Q)sZWR,%Q(Yt#}OaqsZ\emJ sdf'&ZȮc,#z!Ss, ,",{cp'-@Z} z&.$0&ZD iB EXTZV9.e:-h_uc}:|" C` xse8&ًq~vq.ٶ \thÌI}.tRs?]jԞwQ{ץFQ]jj`J&Gl{hP|D JI vuPujt0?ػ7n,WD1gb'fC0wzn{`JU,۶RWX8E[D5&Bhq&+eԶ'+g:vĹCOyTBܡNMèܧ;5mkE}=TNZwXs Ȁ15F[֣7hVQX?iHũB9|`|n(Bd#,WY(%j2]+ߙkx%[ZR>jD2e퇻bO\?`5byuƕs>/6\nK7sAy:k\U"m,A(\ `Vl(ۣτOڰ]^rhg|lE+^3ow#}?Mp6NB'僛L$R-'ogǯ&']ao]N5ߵ;)V*%sĊ,))OTĆRr𡎻 xW/q>gr>|Yˑ/ [[_ lgǗ $BQ#%%ٖ%r Asd,y.?"p s _c,ȅ.%\2DεhkX*喿[viLv`ofp8|ϯQ$k5w˛$idO)إ^[xJƒRV0swlBvQ .+]ǯ <1CCW u#P46ąo Hxi ' h{<܍{g[_ ΂@za!Ԃ8R m6Nk9{& 09qŏ۽xmoZ8dh:J#b~\]0_aAǓ?矿v#T g&dڢŪNm5YDʇw/xc;q&/NNfej:ipRN5+3Mlʧ]&?&ߞ, A[2pSv> t^w;__2Z c$FΤ3`-;:VJr.nKaOK(7Y~GIp[bԉ Vuش_۵O|X.G-`&4m:Oż+7yۭO4HmSDx)&@&wmϷvTΔLW]/ϔagӀeXOBǬ"\h!ݴпv[/Ĥ'/!ś:*zQJ=.~kPq~Oy/z:urߩ+/b]G' ݭR vh]\g Z?BM5f+>̫F|P"LOg,S{K:f Г׳|/'gu\v|']+b{ :_{SepM-u˕eD#%7ZO7울;Qsf 1ns?g7./I`fYz>Y5MZ?w҉3vIfA b7\tgg]>'f߻ Vۋ}oŁn.4L;Os,kWM|ǹ  @uvϵ7y:I4&3O^7X;,+织Lyd5SqXڋ7u@''-ibÁz[c_! ӣxS1:Ok4BXN1)'Z)ڲU[Z E|t򁧋.|@Ƚ\jψm[q@9}.Npeږy|bw7|>X}^2cDʀE rxb$Y&2$sYES)ͷ ğ1I4/:1e7vOSm]]5b|#BymЮW3jZ tָ[O qfLŁC<[Avb0G ߞQi_}+Q(I$ZAZ2)G=؛|/S8 az _Ζms̫@շ^=daтs,DՖT8%R iŵWg� JhYLƪ3u\P)&-=(! H$bmʀ.Ɣpa:Ql̷NΖFH?)*^ SZ = FR f%*kb0ɺ젴VM9ڞwCT1$EU#NU Cdi(I1,ud!jIA̩Xc0V6cVm@Pʪ֒Yg>Ԝbցjcw PfyXGA)$%LE:<0ՑS:AZIhPQɔ* FW)3Ve(Z9Tï{! 
ZG .FLAik>!|w>f!0?%1ɚ9&:3b]K-PX,"2Qg gTtg2BQ"]40苆*8=jj 0"|0T'Rt&bt]Zud8E\lmnkB#X^bhwu5 $W'# D4PP(Rc  e[Q@Hv'UPCXꙝ6Ai0֙pEb^"B1: 80_c.*N`NR'_02Н L,.X^%p *dK򝡚;OP6x-X`-\P8ҰN>n8@G6+= ]T -*``n:pL"&aaң j.)J H V4"ʰ2G'0-҅aa|O\t辤" Rnq+"ptC VFES ԏg8@WawFV~j`%g ^YНMV`vTf=󈓙E4Χ,}XR צXk#p`Wcڣ.)g/F9 T`%\HbtPm5a H%B=F nsp1jD9 ѰcZ x %@@!Mv% hMD0T9YPZjpzPDڤ pdI*.;aamTgT'0S" *" <@{pw FQ 0$1A$KYҢ NPHgR-x|M%6C:jϢ;a(X&%rl[VjWKUK,Q2hĿQ.AAB8|^ Fd@0^ ŴMiXXq k=[?C輜M_^5r6?ϵDC0B`ơ+FQތѣAmk[N$IN~~ڧ]۝~鞲@ k=AI$ HʭoEDfzGg`*m>s*0v`)ju<5HUO;q`*?O<7o 3| gpr7%DM!u>f% +d5$V6] ,w R j , =_Pd[e`~ #dFhâHY{8!).F+725RC4*G@qdDl0S D2TSEc w-#6W|=(Ws _c58}ü ׁ֐i '<YDea~d_,sli65 zR#DwH$Bp x  2Zu>j%uLTũRCf9,>b2[_f-bJnG*<<%?1~`x((7>ThaP tww"%VU,ǦE)Cul:G~]󒖙;m:PXK,$>w;V\>%f1xil( -KbkG}~ctJҫ7I$ae@a%* 6/yEw ˜ j5Zi9<=x:߹{ ,1f]tKo5&a=R1KҼ.h8ey{Bw7ey< w׽M?0Wd+ϒ uefRwc}|9> %9yQesp:F&Z&gHHUBdvJ0OX̆;"0y}E/.e53SjzbcUHEgt*ɚN*{K,:6ĨBMaGc]|BN]?&k 8.K%$}s[DZ9Y> yq馮~gGpvr {j{/6Qk*bn*lymga[&@mg㶳_Feڸ!]cVw!cHHQ\ v}&t>Yc|Ob\R\|n uIs!1kF|bJQ,Lx $Q:S G,R4ͩ[T7̩MpN˧^1jN<~} .n뢋?ag]fJt^ 8"tPbv:b |*`xQ',l<  n#z=ZF+I냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>H냴>~\c m~/t}7`nѠqj#_Ȧ?贋O֟Ai1ʫ\],.t6X)W N #"ሞ{?fsso/q‹lYa Vf;3#XSON"q}R9JV۶nQkҐ3l:9ӏ4,Y`nks:yrۀ/忷?^$=Zpq,{gVr 'xFP*!76д"܌; w7_?6#5Ē~bvH>'FhXr|Rߎv͎X-zNc kbGTԈm_VeV*x^dYkаV97{ge̓d!ef..mdKNrTbAf_t=z.l!X+yկͯ?h9O-ny^O {\/5C |pw)RWä;coϻHt{c[+S9@^?mHoC6iOxnrtG+pʧ?>3ę'~u[LԭuAS)yӥ Lk oE&˼Sn@V \$Ec %Kk WѺ$? e0ecF pdX; 8m((l:{KV `7N!ޣP} 4?k|# A8o`*CϷj.U .Mk|ғ(k@?_Q /Ox @?5Ų? Zϝ0ݯՌ|#Vsi O,&01?a,VOY_9LY{K\hDsYs GlGl6uW/.⺤fVr_!H~qqcNpe 4g<&D`$ )Y)% W'KͬTCw?< 4=֊>ȃzW50S.o)`D[Q@2"8~|G \' u(wJ*r>=A2Gvie]NIsi_樋J/AWSģfUr^!C%H޵H)7A;CE429`ec /!ךiX93Dpn\&RK BFhܰ6Ŏc n{PVa ;f Ae$b~r~]&a'<wz)uJe5Zl %2ϳ/Y쾏FE8Z?tt$": \Fh==@KAr%e#0x"wOzH2n$~^AnemXU ϟK#Y N(w|˕*7;jZ -?+gNVR6ўIe9t ~{]캛iq?,N3a!\. 
fd?~@+vv}2Ň_w 8b+|63ݧC?\~`UVS u׸Mp4nG5k^s5K~"\C;mofl-V~#ޓČDw`I H 6+j@;|ȵCűPsJYڄ{I὚}:GW@Rߡ{v$GOwyzb+_+/Vo27>O}m{;e+j8z#U\gwPl.wrޢ-; v`n%{H/vq4Rps>xX\r ;+ԩQlj =K/ZKe S01dxDMGG=5+qg-oK턶R] #[:!az̨ODPJD,!r6PL%Q,b[+]1ޠp?)h*Y^boX1x+¨dSQRepG[2΂K3iw=nY' 2h 8MIv1wMz:%ERݻᆵHlQԔ[rh I꽪wޫ㉠X!;< \>kjtPaҧ'BjE!'QHJaH Lhk=rSv2=n%cJP!E8 H-t.T3h5F"_sޛp 9 6<Ӑ~wcTgٻ^ߗ)klxvMC $:Vk܇QQod\*&oS 7E%L~|DMQĮc"y$}R܌]`@J \TSwa&pk)ۋyn_%P0 $+:mxLH[s^L.{W$%) i"cq= Xdp_d.?ld CRwUrz;\~S83}9O.\YXSYWܤ˾P&^io~(^T^-(e[7rl^RDmݴ<1;PMOڞm]7VͲEa0,lT:_Ux,ۘ+wJl애Θ뵎W{zHY:pRw_1KK|aydR- ecA:s Wͻׯu/?{;LԻWy/|#0 .$#'@O{tuMk5'mr] x~y,X}ԒK,ڞB(~rM\LavK[^M2~drZpUU] |&D8 tt[cU4WĎz?x$ iPTzvh1T{)Tq8`St'})N֐:"F5ykA:bCL{E"L f2]@ 3 < M{ޭ봲3 ӻѩ-ךLtN4՝?9_TݼtijVu[wg!g8Y61xԏ-l^r# $g\Kr19!f8k\i RQM4n@q70յ?j? ZS^(X3$Xn-9RCu`4v4bG~-_b|>:DI )\FQNz@aH]Bi!42(d4VrG;,qDyO=ui#hXټ*c{Uft>7wӛ+> Nu>~-{)b?$P+!M,L"g,9K Sj8D4 csFtU|5 f'GUUMONu7O'|y f1grHxM j}YCk.X-6x\ s\s S]fr`^1/ _H BaJܖGN.fMq% Yixz߅\e%Y%77N"ᑑ){kOξB(kBrf )ڂj7t08y HcW{1G4xEM#K8_/*nlr lvRM$C_Ũ+ҨPٷdoy~-'׾R @2|&\dK_؋_?ђ65=VBHg"o H.!t0 ̏!Ǵ]L/N%gsK'Re7vFh~e`?p6ftĚ LUsױdVǽlċT| u*|5-D|eP} ?f,JSS*v$\Mb8y{٤Q^ǚn, 1F iOO̥=|U;&kK =mt2*Y#$oh]û:|3Oi#0%_T.CS&ENݎH> й.&R=|,I ܙ⮜-;_.his!- \AOբV^ZjaykU8 <82I:tf6by>PpږRa~ʸtlӞy.)9VEîl@#֭,aGȲ<ŽI-;!mvnmM72]5X&e8p!ϵ49'N6QBtrmplYɕOd&f3n4oyqʷL5}lN֞#_]rX~/4)#!% QAi9B3骃<8kDYG)Ə3c ³ჺ9<`UYMmݏf,ohc6ؽS-JH: ~H|.i>Tak2oVlPqK5ُ>Z V nNs,T(NEYΌ}rmȱZʄ2at>jb w -UTsc%bD!W7 GٛqEpQ/ݺgR$*&TJToB__/sL fpZ$1ӻ"UV9cQNt2-@oC~\ITl棆6ٯfpjXta2 S 9";\k7Z@kJCJ'pce8աxU%HW$ͨ(j8"EQ>8ȗ0~Jr_y{|/* p@JeNEޯ&TG(a%F^]DCg{ 픔Nf6xs48iMv%q67dM en6CZkLV&4QL k7Z@=Ֆ.O_ۛLv4:f{F߳Gh˼nG-/>xuNAK_ntiO}{K.xn"`w zVhDGNKq ,3gno|67 |G)F-]8󶜇2mϔiVY`/ON5%'Zd=qs+4!;] u[cR;ĬW逰 sVk/;^sXe^'K=fܔ9'2Ȯ6t#nam|v%ރG Ц{@_ V~릾&Ok"fo~eUVk V68ڕUM;,Haek݀TQb^^<_=b=0 6Q2gHZsJh5Vykؑ<5!JBTHq7br FbJ !D!y-LN-WJQWJVvl r4}xvl5nՄ+zXʘ~}}lwv RĴ &8r%؉u(<XF!Kت;y`h4BY&PfTB#*$;dL95x#8-Ě\WߖSI,) zdY,P2~,ƙwn-$@AN3C9uF")хG !BH~B!$?R ހzf!\!o#C[OqNq"Z @u4`NRQ;0@>,܃a0-j0#4'(xAc5rq*)AN~ގfpl˯>;Y6$dalN>u0 ˶US.,>^-)%(_s ЊҥD*r5.n$XA "!݁}: 
Z㋶&S5a4%"A('aq[siļBB숤(PGQ*D1p) @$9.dKK0+s4$)>.5JPM _&CUZ즩aq'\rLޛZDE츎6/*}ؔc@2 ObMUs;ܯf Pͽo%P0 VVtb:L&6I]r9/aB$AG'iQ9S|;6 I dA_MYͥrƱhjC.h`JMos+` ӓ+35y2LC}ur>;k.(Pes`ӳzn^횁,rra!2Kl# Cڇaì/>YF4 Mބhxj5|0jG]ĤJn>bR6^-/f3MUUɂĔ#"S!ֺ5r/Ob Dٔ1U5 pLzR!JD 2[pb Kp3"'7lKI LDK9 Q85ꌜ#V2=D) Z5n4>jVwK9" ZE! QX"Cb A b$%i&q<.zy(Ed~^i0HӔ~^1o~LZwPj%w4ׅ%1J#.$-Rȩ^DxYX*qˈWU[0S`'`)6g_ $G +bBr1R:[j1( @ltk֚;I( !GA.&t5{41`XpߎO O~ϓfqT:c8g}N2v2M߮a85LEf6+ߌWJ3Ƭ,4ҒY#&g(!Fs(~> ppd>ŋ(RK-1.Y ARx ?fa|دeyZi-[yAtZz$)D2@F^-{NTjnGXE ӚeMrEO9G+KrnEr:3#褐{mz|}k^צmUm< %Mtw6שyω0%og$;e߫vKGMt9 wKK4/[l5Ɣ\!$ q~i}nY"댷ح_֛E0xes W@}}P$DvG:a՚oӥ۸X02Nt/XDg/ʛ'm~oETgߎǓW@kH_Y{0ly%'f7[kmhp-K4-joLL@Ihu'[(jٵwF}^mxOX࢑5#"ua5^0m^sgoǼM/rI9[Zsv5Ԭ ]Fz}m?7u*(?v`03TI-V؀)axB:i%\pZp}o_|::xy,2plQk-5&Dm`0 iS(JW?ݡ(ǦƓ_?/x:+^Q.׿bY ?.aϣOXqk}}Ѵ+~K{NjkEx8piOec|MѻǶv9 }GIz;X}Q_ srv>V5e.0Kr.sVW V$|JU!脢*tñ{WTDߏ2ZE/~O"ƣt "~aoA-ScrM$i.aQ ԉy>r_̾ѣRoYScBB8S7 HrAqFSꍄ+$xpBQ\Zuq[ak̒ҫv2>gi*40ɺⱸEu6奓qcsC,֜[@2%[ 3@N.EdNpAғ;<<6u.F4~ xٲ@Լ?{WWvymIvչC Lx~,k>[\'/>~PԪas`H\Z4bo' |:pG߿)j3|Z! ^륿~&ÒTa0}Azn2 0)ڪ-&Rf Hy!tpB(đh=< 츉&gHOSbV>o&b(5fcNYGIqTzg{ KyJPv%,rdGak}s3[3]yeLG+j8:~2'#%秳.Qir RJ@1)23"L*p}UjMSNUH!JiV SS\Æ1n}PܘJFEL̗Kw/d˒;a=왑%#(il Ijꩧ*Uk WGLdvYPcLjs+{4LaOsU2)[\Ϻ[u_'~V޿mOJ7Ýu{vLȑQ]6o-ve!NjZJzQ5Ev8Lu+A,iq'zg>Qbm/uzHס-`GPT=DZuR ܄!JAPH͖Hw9,E܍7{/HĻ`KYC͢dⅲJ[DNF\NErc/6=OKq. 
%TwD3@֒ I jT)rbe `b8L{&g~w*l5LO(]MH&y1b8fkE(R0%ybmU.A ٱ*ga-^;_ha$`*CPQF/g.;-gKRO?Jn0=Kf_}8=9UoH:y1m͇dz_MS9j4g_J;7/SCU0\jK UcXTJ0ܔs<>VJY>F\Y@O(х8jQ xmQ/&$q8qW4c[,]5dxtʭ{',BG_9bek4fElU{%[0!}1DA|Sؖ/6FbXa  ۩Wbߤ6%!~1J͈O˵sWP{vl4i{J*pslٕbmz&PԚedNBk7 c94!B3Ԅ(`MJ&q8J4W Jp8p -:0 "ӏm FDG="~JMEzNFlYU45hrP:mqP4HؚR>bcm dZǙ\Dz`" ^Fac 늩u3]Z%b~{\ڨ t'mMqΕY_(@D5Z-Tp~B1+x8L;8vxvU:p~QN>rj L~|Ǡv7~lrzwU S"=,x: 2E9qYdAi^Kڹ}"$BOܪ&Bj;[2>tqQ꩏ށ=ֈ7Rfܜ 24ؘ"V`d[P6 @QmV\jjl {m"M`f=L{&B$ӳ?>P֣~@y7utVw~K _ ,]Il=fvkF唜X-MB9۱EtfB6U4dw k^dwFY;Eit܆وXתsѮޘu\㔽9_;iöww ,ϣοh{]ـf̲ye>SJ:muJ eP*\\P!jS^sVFl]RoqkTB|#G ׆>䘃K`S5~MQ1"]<}V5fV k>{Le#ysM:]jf439l|&Z#0ء6 Gyr3;k\Ք Z*͹_ŕ=EڬdGV[`rYzཝӡ\c1IgxMKsOf)u04˶8Βjcgo\7n^°5V֋"b!zU`;&=l?wˏN~J[]䅵#gR0lESxu라 )NBc,$'ư`{.'\kjb2%zxX18l֊x3"2WJ.gecJι*Kݚgdj^~-sd<<35ȶGoS>,6VږT12Om:զU)!6iScy\5LC7:ܷzy$8e1X'/D$z*:!alL&5~$\2n%bOV8 :#-"x,i/|a=j>u([M䣨8YO&5-"6)"ӱ Z{pf165LKxoݼR/p'-KY깴bT͑42j@9ai*D-8KޥRd~oC p,ėdGT3{ X>z+Fk V>\yCY3PuaW=X'x12tߗ Cq^ߞ}x^=xx^>Lp|IKw$aRkRb+ Ur@dg!UMaɋWːwoL͚Fw\8vN5͟tH8ruocؓtw†}W,ߚ>_juש Z=XV g..ώϗ 8/?,1;:h{{hN.ϮDп.\7 ěžPɋLD]K9U DYPb{k@+*qtf OSn Fk#Q kY:Zq_'[wk=]b )r^|qKi噫{\ɶqn37鴺{JǝR/i"Ҏxh-}"۷ǠTsԚsR>4ViBdlr5RU1&ZJ.mxc46c6c6'166iA 䔔%DؠLLG}S-ЖQJ-X)KBLv(ιFA)X}nyǘbaUΦZ MDS ozi1*́=@ †QWNJiK~ԋӣ$-bmYi3<%9'c $|ܜAwT͍)IRm9 SD.m,Gռ59%6:lq$\ Wtz1')&8:Okc`6aB4'-#t&)Yjp)B*%ACX(Cu՗jm|.$*4+,I{mE"6*!E-C2)_UHllfCa b搃)ʬ 0tdxGmOU#oTxuf L%mT GhvN3)G K!÷(<4&NWp A^kN#8 o5p;oL7 ;<3+ԑ&2jQc)VXz%z{9(TJ7PK |Y_P1B4lt4&(۽XCi5m6T D10Vc, 9H:Tn3V3ᱺɵ @mS+8wQk "eR<2ge]"@ 2,2QZ(?zZB vQTDhP$ƢS\pkR>I5ؚ?cy!gѝp,FU6aa>ByQ TuyXFܫ=+ƂZziA %/&iI^J0C:Kj:z~<[`t v7ZWq`Qi#@(tČp(L(-th9EP!VQ i NtX#"5>dNk.L9m:lHnT;]|1 XvȎ5ExE#.BER#Ȱ=tĒ]'t\&!K#؈r^R@רvE{<170&`j?Ηz Zz4G4ڧ VucL%R,PDYrO+*UԪ/<6ʳٽߗ vmvST~BrO\ ʞ(`T@ZiX)>G%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@HJ))bJ~#RZH*$)>G%sR@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)>_%Qڝ QX:J @R*R}J :@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H *P%|#v:Ty>%/oNF T B6@HC9H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@zJ h.^aYjZZ.ϫ 
׿@ci2[wߋtsy\$'Y+M0գ-F3^])m]g a/> ~@% IB z<߿T|/SR._& |՛\]yG/Ρ4_*G;O % rʖSA+(s'h8а9rVڼ<)N8=m _ݠсWR TW(fwjG_:.| "b.n| 7xvXc3HbtyYށj:hh׳] 1~enU`u7 *5==m@.<5j,UjUE*1_pv1,0'Ck|d\8`;E]rv4$qFDzFT|:[#^y6he៣˼hk>* Y~\oe>ʹ-3}ᒳ][ݺmHWiy.(7S}ʖ;1OFiij]oͰ[.;85h+.8Cs0~ F-OpjThwZEKGP)m˯oտ瓔ȴvmq<L KXil{s_z@[eU ~I񟅴?S{RM|{6 N+#w@-r}fgJ23d\3$M)B{)Aׂ{22{E_j,֧ܶ[]' o|U,s֤ZFs3~ܔE21^G+UujU4FG˴8ysy۲׶?Go%_W'0y!i wըiz-u~.7q͋_4NJ)`lJ9ݗ?pmy]qۉvGvFBۇ7_ ;.jM#IZX [')_իfTͰ+*]&44F_G[vjّiV4DU{WۭؤnOKqgqŧT8 hT9aU3Ʉw({MǒXڢ)'d8b冯6XbpKذBUf$MG߳n:`.Ŋr$1qJNy0j[rL'( _7 ?h0:ߎ&G>#<=Z͏E)zm4_e&)C)W sgwWޟUyRL0'#Oe⻵̅?p2XıΪt|Y'/''n˰Yg[bNr^^VL=ŽC2uQMIJM/O{uljZ8y+Zh@yY~V:z뺉[x~lv@\٤`nqDҾ~8.lz=i!pkƣ~V+5 uxzdl6c4va>f?)7aJRe4VdG4}+[Tzp޸?ۓ,߹e:-Nw펋^7jvUxw1 MU:3uH9ƫJLU'_ )Ym{bܝ#xYV}k7eVrt6.&g wMeJ#?AGi->.᭽+z~}<H).-/1YwWZ,Dx4@>[Xc4[/ $^H(x %Z+|SȬL8|#RH"Xc^Fuk .:޿Jk(糃{͠=qeˡ:=o1bRtk֥O髼A:бaB=*G9y5Q9.àCi|l%R9_oi76FaV)5>9d .+n#w616<ԙZL>VҬ-ܫk\Cֶ /YUOr'"(-*u9wmmJ~:y"l2X,3ؙ9/,Ɏew[V˖Rl%KŪb]k͐QTeu<򙺪P =D 0t E?0|4͡cJ{hyqO}GS9%UW^ь}0%%kUiċlҲ/)q4y膋A۬>` r,6\fYgh:#t`L;?CVpfތFߠ/vr:ͪ e` ԙp'zt,V.E]a S1^Lix:mx~`xչlt6)tPE3ǒ 0;6p݀ôGbp)ә9BoG!l,M7 d˻1rg[EFS2+mT-Ȋ`V5U2cTdӼ(JPAt}tN|DV/ /YoSIR#Y+I7VjW%g-S*%e\EN$ 9qy`Np.! `ndcU1|scq8+1KLLt }|5Ӣ@L~ǩ 0'Ơ;N(=8zMQWQt"p,kpȎ4Sњ6pgKbALgM~MQվ!>1[~ «UTܵ/diK\\%KksPu Rˢj, H$J)L)M" ST3P1KֱX`7NG|끫s%%غI9AC UV5QIcSkT p!I*D)yS͊D42Yd,ULC&\K*)!QIY:av_mVseY3wutJA-zsAíеWЊ:/D䂙4P#7dF)ek^ ȍL9-MSygړ^sZdVqʂa?Ȃi}^E7=q,n۝ N k{Aup,[o^x|KL5g$Lex/ 8cM8|x<4 0 >l݋0uZgН4 RwϱтMhp9jV8c4x̦q,31;4Sa^lhKfI?fåz*kk0.˴g1䘌f55:vxyX~x!c}w `tY؉0uZX.ys?```,Uұ …Ĕb>sƋ8ͳ7d`ϱ^~*0Eמ@:}72Fi#pL;vaXo=qAƖ޲i09adɧu]R5ZhlV?`lxiұ~M6#LeeİmVZ%c/7^: Ih~Ԡ-R "%H/Ob (C4NTNpdˀB.őR/Zh]?"M6z]ǯKhtZtE,֞{sΌ GM__->ysjB7o&g7y̓((H@ud߮>̒mȾ~خf/:dn]uX}xaΜT?Ǽa)c(||ǯww\9ogS[+ իZᱤWmLižVτȱ1f4&Xc38Fs_u6+~Ј{fKzE2kKk2D#+K"&M=;S`&ae[8*\sl? 
bxC:";2{RSj؆:d/j 1dE\8vɐ:ұB (3UOST~j=SóŇ Q=0|wdbqV7hY9vP)MџrldLqQڬoΙD)=qEo{X{35S!tⵗ2&eN Ѹ T [-E9+9mbWݱWz9LoX../NɊ6i`y x['@O6zФVH:=[Jo4"۷FTE7w!VD= Ӧ$Lg7Ą3"3^ j5n.Lfj 4L?Zu~&|s# T,#D*xPƛ6gzP!G8thOuhʀQF3<˳S@,fZa({4ͳR4uil\|T{v8_~Y̖I*YzS~b'YEi>O^!h+S5K^o_!>x->b_?J)ʥ?iؒo$ŗswE0kn=;^0y#a;hZV;9Q%৫:Z0яZAPd, W%MmmlUS;}? a"v]: $,.|M)cExg4/t2Lngۖ}Z\">=iDS)3a@ٮ}Pع;+2s)z^o H2^R<+2kp_zfhTBq؎(S8tj=GP+ 6VV165c5m0U9nXd}>v.tkT&ԉĞz(ei\|aLjj/}uF R ko|6m",SW%UµaewLlyķŧ, A`sO\xa Sajtdoq#=f6q74JNҳ]|' ݅n-*_'UCWB (pD9N4=3b՞sbS /!EUhgI)tN0H] dL0{r>ҩLpfz:=(X&s;4dtWK/A$k_T[6Txp &^ӑ})lTc+HU8, Q?]w*'/Q[< RL-@]2`њ(=. R(!̀>Tj;0.";U8X h)"=qup1HF{C U>v|2頰RKFLj>W{< wKT?‘Rxkl]+gуm:u%jͤ!7꥽@K`Wtb巳ՉwUI*m-o~;uԆɗيk n@Ѭu\>X|~CJJiDdknl$.pN_춺K%jԳ.>^m믗$Pug3M}1HP|08;v\8-o[ $>4 . :@PCkHcJJHo8g0z|C=8> *.a &7Nns)Fޏ(7<9zvO}@+S&N'ȁpͺ G:xN,48,p7 ]E56wE[( ϲ4<)ٺT3,SP朇+AJFj7%zSr2 Fc@L c5}5V8DCY/oRfFeHRA hgs]͒l`Z_IE401z,lq %is,ۿx _lD:WIbAsQ6"jAH;]&s`n篽O$"޳Qa̿2]3mo7_$Wu9ARa)s<',hAj^NVKmD$.d{rSjx;3XIX/@dgA'"NBE>y LeJVZ1AB†qRT쬇;W*/3EI|yXH;zv.[FZs<6 ]Hx~=u|hB%2e8v(=/Pc~}eL86P+$I%bh 8Lm UN5cL #{U)!VfRnd {\4t(9/0O&w~LǐHX͎kE!JȮB-墡\Y›պZ.~/*Pb}U~}C~nEv^46CyWkqus[O2/~+Z׋b[l^_-/w͓eAVВZ0':)IE R݇T[/+Iw5.Z}ݠ -kAfud w֦XugT+Ij!1` S$@vcSI/9IxSQ1?֦ѸT)ʼD4'0A j XBg tk*YBl 4P}0uo/&9DDV]BH ˗B 57…psdS~RnC޹I KEHbH+HġV PؠĢi[-xhвEG "#,8uȗE AH5!n;Ȏ1E9a!"zM=º^9m5vHLy q0 Ø! #€Uv p^ъH+WM1t8z+Y!cz*ڳujU>_quO z/4#Ԅ mA\D 8(\XaYzDE}٩'@fpJ5hU) &re }N1S!c&/x &@:1jIxrΒ qH6BJ!r-sA {!|ɱPAUCJ{Q-'# v`c)EéqmP$&Pj%I4E~qx]$R| @Cr-Pk])Jb'}3eAUI ]lU|\OV/ p I„5݋TR2`63[_ bVʒTjɞLV<^TИ m"g!H8'Ͳo%Fl dca?/JoƣeNk)@pqhp|Rg tI'@em Y|DRݔa#rQk(?-}GlBA3}pXMvˤ`쳑0T>p?jmcyY.?ضܧty?ҧgDIcB2D; }Mxe^X5*!LNR;)=2_їOnBkzM:R%Mۃ)eYP+&JhW߲FbԱM..'BÍuj]F{/[&9s#mƯP9Y@Ս^-+M SmFJCkwZxv4_lW[_~gZG:DF@c\[?oufU˝ёEjh+w;qIB}2"T.ͭ;=hK] U!)n_7(UKXɰ{ʿw`eFd'.ݏ|qwlawnO^@#׼xzno/6˗*'F/]`m_pHOps.ЙB&Ͷe Q~.,%o5)n0z/pa!m~>;М^`ѣ0B$rFmBNhiQ,Ϗ>-v/TⴼrUT3~Iw$:17/y 0?n$]?ݭ<`?~wQVh--aW) H0t*2M;llӨT"ī~oivl4EZge? 
.0(aZ4Bbm~!$ rwmMUFldv׾IT%z֯oM w~č,0gޛh>U&B+جF ô mGJݎkt35w骦Ph1`hll_&pH묱CaDdcF=y<\N'*3M6CDO&4.:kQfjs!62ô_&H,Nz>֨VNGSU_FN/7SJ^ ?r緬'DZ)8 f]5WE0Z暲jr]6]JȆˌN1]*D]sQD9jB9C{c. UiPʗ5_X}^a~6ar!wi I4W$!:Z+Yuh\Y~5Oō]=R'~=GT~2%#Tv3"^.Z1EMٺHFS2 GvLꧮ!Bf]"Yx?iD|3"RPapOoꕾ6 CQ. _g'{ry)c8KѶmTs_=LGNF*Y7诘 cAVOfi ½1!6]9-' xа|][!c1 TEPBvcC:,.1;*zҋ<" ^&j* j"ܛFPdb BB$1,LA0s!,R J g5CNr-fH~/SSbFbFL)/T3Py /NU,~-G?<ۜeIݪ]/*6ˬ!wV!og~EOo/ftO-[3bPVʮ䝹ʟ.P|ێͱY4ÝP\^$^ĞQy Kή+h ,o>g~\..deAݙmRE`h4Xpt;fUfUx-gg@v6Q'^EMgG|ZͨN*l0)J= B ;I;UF$#J_jPu9~ر{ZVp7ꫬҤg[!Ww :/IEhjSjW{Z-k'n2&-74gujHβ'jLI*_m7;6Z'6aҧ)'pOjީ4Ft^;X,0 {I LŒ6ٟ<T%dJQrlm)/Рj}6eB$- L.3ՈBjt*dxV#pgEfH`y@ @ ,X8Mi?(/c;i#xd1z,Y L,ӜEz; Bۍ1uVȘQA44T"\~BleDo\jw9-64kax j3Qқĭl:'gL(f}jg|-1/84l ]sC=#h4>b6ɾkF:硲@B ^F񧑸DI::7U, Όr<$N8 $ނ\7xb;/-e&m~7P<80,nr`~71lPp6s-58$K9zbO 8nYW:9v&'PR2d "un: ) ixR$J2NEm7Hf\*hjS#]΃$HǾڶ<]&X]95Gdjugwno#r ;g5 <,L$ ~Z2Lfppic~Iة3%CYY601EP}SGv_fU>}sPP;bNk *dxzX+[+&U9@ iZBU IngاŎwRknk :HždO1{N t*dL4@-p\Ys8XyT#M^5Z?&'DnB}ihՎ2 zzDuN1W}Qji%7I]~S]ȂVvuȭ%WÚziٻFU֨cEx#6KDR#}.Z Da+cE>IY'r=l6>]W]emoq /\я~Zc2 Aw5X7Tvؕ:57 8ڟlf^jG[/B%\ 7op6<(:tj!TPF=03}(=f/nލ=[d758VEfw|XK߀q+a;a|ﳯN;(sl{&cD+={kLP9d;B')_$PT"\| _ 1s߿j)Œ0W 6 ~.Z~گ$XZ*#~}SG۫e!A""@#+F \*FZn][T Úp6;jLhOrELG ;9LsJuWM wr|Vl؋/78iP,Yj F>'Cy9ODn!1݄zQxG7f}$ 1Jz{T2uߑZ:2vRAsa)րyH1-†!6(L"N (BbTrspq㫩G,efQDn|0.i1eC .M׽]bq-CjRt‘Z*$'t'fhwUkW|sL@it:+Ր{ӣ^۴@-Ygl/-)(C(cGaUQI?=iۇY>bA:cPA~߀c+6Ӿ+SF&r,?bF6S&g}D|`C^Q) Pp_{mOmJc~{qmIP?FIr}=4֭ˍ+':-eDPqNrEk 1rJt䬗>Ñ "u1MA-wZ(qteA*~)ED%SBj.}LfR.q/rY;:z,@4=PKt.#F@M}|`!,xhQ-өZ~iX\*Ѥx)vŢp^$t|0WyOK_AP_U~ 8/܄\( B?(pYlՋIA iAgI) pf/Z/&=:x$s@c$f\ʼ.WLiAs [;IPN8 VUxd^ eŒ@ UĮx=Pܹ̬F7P4P5 (1DntSC7(3VAkSਃE3|2Qzmba j,`)~}pwL9%JAK彘MS)WиupԬ԰cڇI%ƪlA0vh5G( ow|L]ߚsTsmn2_7TЎBo a(- ~!~$rI〽@# U@7wW3A0AyZdcN]Yl { [t0d4RF(OgLcu5\'vdxX÷TygY4b ОUETV(N8(Q.f]b'7̔AtUwDէ߱%ak}3~/wHzQ!@s0r.c{a0YnQKAc RDzI!A<4@.?8TtVC =0\iEàG-e'rIc7MP\Rûr] JD,F DWN9Y*azތfy;f-Wvw-lw?p|ޙ)nD u |]r ( /_?'о+$;燿S ~d_-,?e~-&dlUn$lbT Э]̀TI %&4!'h%L Re^KbYP3`L7mE܆RYVnG$;ۓO@G?Q?Kz&0 
.G:BLXlD`aw񵽲=EN*Hv8R4e3355&*h\rYYIadd!ǜֲ+?ܤTPJ+)P.ܨP8!Dd?il{ lboN}e9C=|.D<Ù@&oon^rUD/C_B6\7h^}Y˙բ,DG_(řPos>$&u4؃X쬞xIWи@)$+Ěu F *t80#=g*u^XC9J=&T(;/)[luLː W܅R&%Ǜtxw:TIIKMTo%)wo{PRʤ QJ)o?`80O0AGF9A}\Pv /@`Q9j|rZ%fiXއ)MtY) 9Q1LX׮٧s֍=&_c&E<B--Z^E[ދJ,Φj¥8{*V(kNߴ8ڞp^KE-WA kB-3ҳjmʺU$nf|`RK_pEURPr\z鳸HF޳bDX&$hEa0M ,׼P Ĝ/ vӬ4e>Ý˺h]d\E=}F=cp(y|ݵң1 F"ivEHe_-$hL ō1sL+*h\8 kNj,!u&˧90B2-c ,?qIcǕ@K{~VIMHLƣt*Nq >ubdsDŔW8ai_B43}6;<4 COx IoV'\;An4I%z+eAqYPc.eWL=)r2:.frݒFifjѲDfT4\S{}R3k*5$+h(}4ث^1(%V0KPIJ{SK 3FGkO*h\dX>ztU==TYhEhI$r.x \D Y.uP^AU Oe[uYqq1UIao./g|]8(ǿ~4? ́Pk!,*Z8tE8wmM#K(31+=q"NNLO<%t)7,L,ۀ.!1,w*+/_f·& "BD§# 3&\OO'ecHf9ϿpG|:? ?6֭zv^lmjo 6fck|DS>ߴL c 4~ʇCŵOt[0M6G, 'Aqa)frvf>}jPN}yެ^ bsQ?J2!T5I ߖ<ԶH*ѲC ٛCHOFqE! cE:I~EeHd+rفīǛ4Ş+?e[#x3H3=i}YB1fm0N P[aX1 "0:0D"3D∆ǯ|'UKo󖳱+h&e=cMgr.4M^Or>ۙy˗ǣJ݌Fb|Nv+D. Ȕs3Yn1A%pğ6 GL7m^/ZdCfVqpWӝN&=6ZU|Q%ZLfjxRk\|P1D)gXA-r%/EZ}mVj;P%I hs/#S,5_OMϟAOo9O4 R uej{SyɌ',Vn5-Nḅ̞T=HS@_(l2EBܨpֽ}SOv1 =/p0yXU7\yS+c49$yTgcRI,1=~s'3uZw5' 7w?GٶZ^L*=֣ E`*8´m؇[ 9s$9m]}y]_gf1ՂC.^$k[Cu0Db^ѭVzK}tx A_' &iMQ.ı[Ǩ~T SY/5Di$alM.btPx= WL׊DtKyǥ-B-[}ڌVs{x\IzЄKVIۆKm̧#sN&iԜe<\<ϮߞFT㧗]oe1M~ "-_?~,v!cp|5oxd*+7\eg.3d|'9Ka_")ȟt5@2N`$ְ,&Qy,h L1i8M1luPfgu'诓/+Ȇ _`ER4NEO8i``G<0mLs\*CDϒf;8Tej2,j) EQ=4*3{SjbAQPe9펦,43 H)BR$o_WVreOo2L&bPg3IMD{OoMJX't{&#o}P5(r*RiF t^U݅CnՌ 玹zVCD]J]N_6a.!v=v۾M\0XD9P;X#e돪vp_;xl|^2}[CirUAkdiEоݮ"Q9å]\ƋdYTܶFWJG#@ûej =U`%j0Zҫ ,,2[sywV;mRRg"Q: c?%`χ6r &%S_^Bqz3L5l8BID!8WB"Ң;.\wKI$ݧDL&$3CK3#PXAAjGpI>]c˶"LG%BBA6JMxyI1y^Ab5&1b_q|p$9UwH<ڶ(2#7~M;ث4͕홁K$FHHǂD,l`a[ܷkƳ\4>\DQHж5Eb'Pư:⚡AB uN׺b`%JQ]ǯʹmXȘk :a&d,u#TG[+8"W RgZD 'g Pyc |׃RwL]ލY3. 
0OC!Fʡ:AIɰƣwҚ[Iao A`@vש7aVmBbMG+2f?\)hx6 ^vwt Z]7w7&5$ d|DPLH#fzhEcCp$ .;9&4h]Eْ?;09ok""c4!S* c<'Jv:KC+O0)f()a5'1\?G:.4l?LͮqtGlN28c*$b0qt=a ϲ}'] EsXs7ue(b p?1,Y b x=%҈F2F c8%Đ YaY4NҌhm\P%e•Jb"eXFܹLSB4%{tP;'%N"׸ ,}HggH&)Fg"8SLF FXDF(Rk!IŘa#i2D4Ȕíb="cJ/]MR|{[o(#]G=jǰ/k~p>@uŠRx7FTS^dǻ@S|Žь]EczVV}qB49`K bi*X+O2e{0 ގh`pjzCd \I#,Ӓ;NL}Ǐ'+jJk/eb9韃~ hX.?ρ?=x.cx9*"x$, `%ӔrIibHdI,BVA#-T_M֐#ց5l*,6-9#dil~O A 1'OoY79G`3=[¥h%XG,+2}ʹ\UuUyX {yM0)f7`&{pQcGH(p QP$b7d79l ~?m OB^n/O[ G nAV9~19ܐ|3S. 8c=8]h6޵u"1RbI5@h5Re.3CX?X"ra[k25RMF<XJ08؞mfy9Gžaktep:UՔbܢbn?~N1|)S 9E<.}vKPn*zL|H+eu2Q]ezŘGA ^L}!Ks(˻ A2=NEGtǷW]E}*amd|t>9))4) PRgOPKz9 dlR5چzQK(ʲLk#'\!2b0j͉16SL0V*O(ˈ[ e>yo @#GQe FݗǚC^mo{sq[FZh0fJ2; ~)eDF #8v&Jh&^r@sX*K ɱXfC4(u@CPr!B4q%(J$t[ff(!1t| JUs~!̅[qc}WKA|G9#r3 .aYh= ]-7ی$VѶ}BNe˶?|/_ E4E%fjKf+&ϺS1"<%nurŮ'ΡHU8K@N%2/l/0Y+jOCzVuAy;!`zӜc0Z"`}dأ0Ѝ\r;d[Gl5턢Qc栏m/ߏ5%L8,9]F)cW :( c)Q!@%I,J;o;PW'upDCI!}w%z(cD2R2)ZxIabRL ՘HSaFe"l?mL2?@B#SS z:Yz>|rFdYb_~zv͙5>0y `47x~к*[bMЌ+dBfN`$3,s&1\#^xGC1[Tb  C\-KsXvejpnn(kEVl/qO<\8ŋx:yA1(0_RfsClf՚~kK!́v#W.?1ThzK'6Ql+͈ݮ( ߧULG9KM3ea3'=imZ1aj][78ۈ3)垅 G6'8ԺmL⃞0 ɦ!:RbC19i1Wfu}[v2iUnmq޹rl( τ8:>xֻm@T{b#Oz=5z<U)cot`'RE?Kϟn4w^^ճWVt᫗,߼)?ۆی(sfMl/?mLd^n޽Mc;˫·gGuW|K~/\sqC^2u^d]UTLGdw_쮿>eW?9K2;o./_ŋ `|8DQ/.z~7? 
y7} !ya5۫Ct7qeX8=noFtXNWtYh{N|Շz.'؎>$ݙ?[y3UnA;܃A2  LEHƂ4}xI36cxfwx$ arwOSs2!4.KjaR8gI MM]p ԓ<N pg'h6چK|o~A"%C7aE!#/֑~ٻ<8 yS(Gj$;MH^>VnÝ-$jċqӿyu/itnhċvVcg?"/bq}~iK{ٛBf~{f{n&`,OϷ0Weoyr]4.9 Zgn׳'Zյ?aE<9Fz ?~+߆P`SXֿA9;]o #&HѠ>#׋!xǎRD"zS7o%Tyc}'n^F x7aߵBD&$PnI-$(eMZ?`]O{j)aLkNE1Y΅1G.T^\ hP|B5n]%;H ˊpō>$GBO8`,Hc)s<$|PI e*-nep"i^'~>[O ^@cŹm8>6:L1TdȀP;0g6ZaqiM5uimlX&@vsp'tmĐ ;I%;t3NF%We)Yyl=]ǤZ 8pMy$[E1́X%IJb$ŪݤY0qaa `{顭d$${=6с&&Z~!ط>6>=Dm&n[KI/%žRžxy~ΦKs8k\7󕴷hW5>ko+="k%%R 129J()L-3$faaE;6y2x林a7lVܹΘgLG l,6Nz!={zX.8#WwE%KI%;n9si%Sx_Sfxԍ [1n;wÁz^S86xtʣb[ĘXBs[k #Vf;dšMJ (1w `Kil/*gbkf] z\ɦtX[HyHXd?yFո:':}AŽp485R靴.%{c%hӁCPF.̒1#n࡮a=j6ݮJ8r?*BlEG^\ ռ\V)_k@R vARnd,d.SOd&_`/HR^Cuc^&Sa1;T ćd3n>H%۫iJ[Ut^;JNA*R"f9<*8|\ڹkBM2q}E^Ի-0>vےfvAhWS QmVtțmܬRc[|rA-Cɴ 3l",5֍E_t KQO{Y7Np_Ͽq)ߝ%[\ESf]U(8ՠRnܒUJkgR^Spl1NG@ *jTt_޵^$:b<ԥ=Q1K<=i_2KtǗW s c#ambC=+Z4P{0ug: /RR%=k'v E,V)%Vi:`LZVJrŵ4&ϲ_`Ps5Sp27PƺIX>y9Otɘ3Iח;" /XV_] f KAXd,Z Ra^00qk}΀'þW& oHHC#jb}_t]{}پlg7@.@HU?B+70sg!ɧsM0$PwN(f l)BͤGKR?fƌrʱ*J*B^"[6gNw.mIMY%bfl)&H2SHkQPfTnlyY5#$Kx-hJ『8I5>Yd9Ykf1wR! U9fZ3k} GX/</ml|Qx [Cn|"]:le:yS1u~lKI?ٛ[ ێ؊zSA|t6z. =%R?9 }۔@>)6v;+; k)O:R (܈Wrw҇Wreh["G3EK+;Zͦm"ٱ֮nbSnbҮڭn}K}+¡-KNuzO0Nޱ]7D3显bvS6Vhwthwrtm9YۀM9gBRb@CBh[vѵSXeBh ȁ[ecy

gHΘ'JjFW&9|#x{HK8h)v!>]u@d7dZ<Ï?>}riMfN/}O~Ɖh( i~ʻ/2; fWwH(g$rp+kخ_: J 5yװjĤۏ??O[دPsE Ee< $9]N9Pt{{}ڷ'/4s_n3xuT/N߾˺A0zųA_*z e\7XTf3.]bOHl6M&te ݆M@\3DV RMs]ȯ/"ZO5JگAXWĐiOb5i Ij/6M_ g|,Mo{j6VjS#/ݍ)j0cP#ZϼRj<2D͝;j+]zn3 0Ty8,XKIq(A`Ikť|Xfկg?Jw-_lh쐪#NlܽnPͤ%G|?d)#- *rnB#^OTA" *YOX_ uFuh컵deEGaa]MC%^w8դ1v'p V&׵Lh$[!ӎ-Jũ\FRV|J4ap)jԴ)X7Ͳo/a֣Ij~>ޅy i/ۣfmi-~U6?Ϋ>0 SӜښNѐ34g,0"uCi> {n%o<6 4\X1Xl"7V4Òʋ$,Eߊ\S)2PA(W3o\cpfÆs9TҦ$yG`/6b{DԸnV #6Q;エU)%P_^ӆtxW"]KFgpݛݔ%om}`wK嘯!Sk):'G=.8{h'I{fHa5ͣvDچ1*J߯B#dJ I^8e2}ֹM{f?*Yڰh(=qo[ij'`>PBr쐑+Нw;qz;҃2;6 ,ʳIA< փ OrV/g^_ ѷ_(JU5Q?Bl67&z垂W Ufu㎺hiFe;R[Ѫ{/ѦV p9/C6lpnCSwI)&8mLA+KkAݣ8wb}~؏tBӓFX]Q|N`rM|x^mZփ\tꍽ4o#6gZr1g|ĜHe-xNv9/ow=O3έɧ:]cO^,َS΂$'-Tno^nΝSؚ:9xz/mJ:)-Ќ099Y^'I#/.b 4ޝ桁bN&S%YGv'kl9ź|ueLS j'EMD#P uElMe΅ޞ"Q >dEpb)h;4\DVl< ɣH#N=do]_?Be'`Bӧ;8(}'p>Ƶ H5 #gM `7+:*SjJlu3KYd6'mbȷ忱Zd)>kbcG4A:[~" Q]cz%<@VkU\\"35ۓŁ(2;J`̢E1 o\S>/ V&dc`H[3Q(JUx#'bknmx3Wz5ڷoOd#V|f3ƐR^Ȟ–K]n[sV PSt :9$VxFʵy{bgc{Y[9}Y ӎeK,z0w+Y#0qAIq榢3Z5`́_kuюKT  *Ug[ Tiom=T:U7p'rYOb0dr<4z9}?~6_E;'߿_R]MWυJTBY}Ul,^_._fw8\ބ޼V,?i1x8ޘ߶ҽ1-̠HuJSeU^6їCgA3vAk*c ֯k St8>yWtq6`rֽbF_RrsOFxW.7ֆN$|"akVCX(xtq^7$hj/{K=)WI+{ୃ^lVi5'3؋g7ޣ$t0;-)9pW1{3ǰz*n{V֜ wEAxmpZPrsO]=ܯCr!}0S,ؗݗQ wCzm$Y 6/%AJufLR ;ஏ=RSJr꓿p'q/m~?>:z2d?.(QKz&/'̘G3:A2Yuro^@`+_x H\%oO}GY볙қi."^X~R7MM(Z3ӯ~Mү&d\3wȗ(GŦSKT, sIR %1gFr+ۤ[ 4w>?$\< ]/?tlEmB=@ءKNF1$n[gF/3]y-aOBهEBJ,V/u\@yɗŽgakӋ#ǵDr\P UBnI!LD~4~VDȗ9m]6a>j* {:j+6A~7e[!rxDg?;⇼X0p,)9g@;PW4[n} nȬo/vcwŚQǎB0A/ 9'`1^7j\F$&vrbp@hr,&b곖hA̳Y(W:lo^bέНfYާY/^VB^wחZfH@*!k* T6jE9[`Mt>ɛH]o f͢dH'@]9 x@jCR^D7/qJH*(~ @=H NxtH>J9s|rOÇG7?5ҥUm*rd--G`8$9-u1p[r^G/{DA[%Sv c(FG;ٜ+cTؓ͜#'t3s(GODN]^1DthJD֫rʔZ[CDRclEYӦROxXt}."\hh O鸘wS睪>T;݇c'stD:s'5*7s?,($Lx_2An )be"Sy/DNu+|176r CWLI" CKy=GA*FBS:kT8UV|OXbkO>TL'xm}t:sBZ@nCRgZQQ 1T$+1י2)ϻ,h*BQK-,Wۇۙ3Lpqlu ǻz&wObzܶb{ӆ?rz[AdյIzZl^ٲޘ@)>|TdL;k< ouwӏeǹ>\z[}>W[U0)c)#z6FGDߑ oʾ˜Hܶ&Q;?O+oj==dj 䡺ڜz,!Q)}26ɟ0 Y%z%Ƨ*NR:א .eν6NɬarpRQ^H#Ή!⎳^nG.Ϳ_CzVlZgtt[149}H85U5ut.Gv4tg /'a.B]9zJ#e 2$s,aN3&R#l*PPSwQbUJEFPo?5_h7ݫSdg{YW_G/3&ǘ,K@9LbhFq߰wok;SO}wgH]jii,e֌/T@jAVd,_2qiaTg> 
R+0ۂMCJHB+vblP+`lתwvTXv[iU)BZp]jp*̐ S"2O܀pTv;[tGIBFS}KZV&˨umv5o[L\fzV RTԤI#3Χi JPKeattf5,ݭ~? ;҃ޱ^ףQFmCU^ um4X3A(2 7$N>[j!_{\B5V6oX1\@hI.j6GI8=La\1#Wn rHmCS w6|^-osrդmXmUc2JKsI6I^0ыImkcOIN0ϓudB"fUv*d=c[-@fszd33 ŘQ Ūy^˚Ya_jeQ1¯1ɷcW`IѼ(հa=ȱZ]cQ4/uz^l&uEŃJ~/쮣Z[z0N%F:-|gIk<+?mtG9LH8'(v'& cqCٸ upH!p X~"h2Ǎqюo3qAjDsTyF6ȳP0-j"Ffw .n,W̄2 ihcHMؽpbGjDɎakI<.8"[ }JSՑaN0 [aɅ.R|ӡ`,hҩquC !gPf&7湤EH_.ߑ-^"@hkч?4ZA3bPv%LwX2Nm닺mq۲ MIS"#sPF][ zY(N4JiYgZJHVRDRCc 3v!fe@)ZH}mk :'Ka™ ӆ5!O @; _   `[0i 1gLDjw .%H`NTBh(7"Ԯ ,HmSzt V))NHIy!岱 dy #;B>*;"< /4D>W>*;"Xk(>AK~kVFvLtt#x# FHAZMq Fbj@j"i ',3 c i3;R#=&fi͚V̓i 1A@vk&^<jGZHJ0p`!^nKvBX1Q;3!ѴbF.b5HB5XjIY% :z> QUpN}(W_ j=z0-)2zx׳zxX\cYHwA/W4-Z9TBsn2"1_^ZW;-?s&V]n tѷ5gE3۴\qFx9 596/ii0"koX}7&7wuLjӻ|Wh^Sb:3HRH/ߺB^yW 94(] Q/ (>EKPvcǨ6e9XW IL-dRႍ,gwW2%^HjrqS,RI@/ujC2 ˱-~&YM`=nI B(wj*l|XE#El?" )BXsiu0khHAI#X܆oJZk WTKk6K_:pKv09+ }fr5H612OhԤgUJ:s-@+!N(/$0F&i+N4 ) q vy7^&H9fiR&AZK?4~Ԝ>H#mi "TPFcïE;0Ј&R:Jֽmwnz%p@|Ny NLP6?YzJ%Oy:3æ$"bC2N<&^rʌ"r$G7VxCT_Y}<иKԣyrНRHBb߾Kr_iV hu2|>Z#ܺ̿aztsQ̀Yӳ~mբǪuF4xM7׳_>$!Li=Y^_eG/&u&E=<Ҹ]N?5"k>Uc(O -3umMٯZٸR_g}X/TrFGwo &1c2}r6;?O+oj=[ qH-R2UY_x1T܊Ba\iN BcG=n%Exn^(277 ^HjrqS,RIRB/ujC9fܫׂ%P%%m= Z-z y6hP,$>ϥEXx 9/2 ImS^vt>7<!8][oI+^s]7V a`'dvfF-V,[]&jvj-u.QecE~$dȇ?mK<>4qAۄ4)Xb)E *G)G NI&@[Ll䘣I:?<):qL#M} RTˑ4e9z{d8'_P2{k2mֱ5JBr]x}"qF`q,KH6\GFZB{%/s˟{ud h@ cPLy_r!4!+duE) PP rUjf@\z4r(RbEZKCsdvE ?}z7t4U/rӯIw|[\Z,<ͮnEs4*bcv_QSUFkPEmvX1{RdxwKr{ZAM5U{F$Y#ptr3pMC(j@7mVuFKh>PaJE*y{thϦ41Y{ v ђ h@廩9l/̇/cߕ#^vWtd$wI$)cBE>}폯<7sXm1Z!11J%o4Z=5t#RՀ(i ,N41i -`)iFlVpvUzX}qI.!8i26hxeR4q]vJP[Ik&K~:^cQ*gi "$^i55Y2d+]nۦM!ځVZ9lͷ+[_u ȶ+'+'Ե6nhF.a@ѯ 5 kցCW(ٸ} 2?ΏpҗG}Vfv;;Dl&L(NX1sƨ{^ ӄkط~l&yVgY '"56TNT[iIh" qliŽ*%J ,h|7Rp;I |u--BߢЪZVqhncV6$e9<~G^ѵt:>*5DצlJ_w Q!Q8Fmwh4~fFأsbc˧\2T`M{{ײg4AM>fws:e!yMLfdD47.Nj!Y>h@a/o uŦE~|Q&'֖ ηyUr% huW%+XLhreK=ڽ*فyޓa[wubc^JԗbT˲yfs\BlJ*EVq]Jjg(aY={Yo.iD-J;1{Rt6 (NHٕA;S00cÉlPޟ+܅Bal<¢㸐^j=]>%@C\Qż*fh B/JdmK6/d:p[Qocz>C8+,BK+J?^|:!'{BF<׋ߢcܐ]Oů&2G\g>nMH6~_' <7:sʓjATQ8Ew_GdI-'W&,\#gϑsL#%2Pr 
,,)cb~vX/ل^q[Oo]ٜw(8!m'iSCsq,RZjIV":H0mn݅)va چ AuY-pj^;$XpCfcq y}߭D+qp].1t|P#T/Պφp]a CV*(9\FJ_q0 p\5Bּ+L*n,W9U]OZ0#lDkh<۵1 hRJX~׿#ҋM\iͧkBehOJp&i[L{z5U@ tuъ[Φ.1p"%mICŠ.FYPSP:|a|vO$b62ſ{w^E{,>tOޒ_@?A\Y OLbRf`&LN9L4`#L#6 Q:a茤ęe[Ե!a'Eߗ/ Zq|[q!ס Yp!$;jK!Y;$$2Bpޡ`T;$#H&_`|v/\hadH&_2Y״[1 ݲ505hA CZ۴5pkCZ a/]SX2{&5(}+Sض]S@6)EBif7kGp=׮&p\;ZV *(kwۣk Jg: Oh!G')aJ%#-b+B2ƗͳCXݍ-Sڄ5Vͭ'v(JY(##*‚~}pܮХXP9L4OÇ C4s'&ȬL<Bl:M/z#Wz:qv*9edNܯD <&>-4Q+S2L7xW69 `ђ]?V+hy$R.E LrI9"9*%(WTN(a4{]d*Qe!#ŵb2:u-%Os[*enYHI'gcyi;OY(܍/ m\ttoJ[VSû'{JQ:tBù߹koKnӉ,~0w?:yGvhBݿv8"GwxS2턶fFrE{F>h8ruDJnȂ@e2%1kl; Bt_n0R̜KJ}%uѿcBg^ ;SNJ6܉ ' նهQGhڗ8DZ<X5m#GEЗM|g@0~8xM,k$%be%Z@H-ŧ*O=_'6kN~\;)7&\Q_m,6b,6c!ϯ{hgq\)y ԆBᔆ5 JF;8c B 8'#9cwk-kז2c(t.w]{XA^7.tl1 6E06SB@&l /PZU֕ VXH(" P}9!^XL`$j`פ By6t#KVŏ.bX@GuˑbU |쏨I:MM_,PeEl-!* W%BDK!( n4(J$JYC.$cNZ Co=UNvP @پvP?>ꖗ\A[37AZNMcj'f ~oz*^o0)iLj0-_\D^vbn,jTF4*uPY1\\βb0u})CoтgI<dn×ɝRm&n)n4A4A,r=Yk|?].m'_%QdScwgX~-Zfg[-co;?_"X~gt8?O8Uֻ൱@,eBsD#27&{}벳 R*"S*/%v.2R"(0*y_<+6:es+- E ? J;PAG脤z<5L(AAH Sork KiIEypicxʤfwڳYEEAm%ZFT6PbQ8H\em )t PBW *㨐.H \RtNPwRMjExB0(h}xk4Tpf2^X2o8W1ۼu͟jprK~۱_}x}@Urx_}\WT+Xkp0`->~K__7C;ܯk9Q]h/nS+ҔP) oێy15xh>j5 . v7`{3 p7V;6F/i]M͎} 4vᣬ":4*o,ߊ%t/I ͅVB~P?7}\s{}3t6L7>pF_qϼ:X묌G"[4 c<~q߿݄ zkH-0p`<]¨{g^- ͪpi15Q Dޖ0c <Λ)9vG~ߧS\ Ib=_#[/I ۵,^h/pz}/Jž`V !Bq2S23E1RZ2S%RL 'r *2S/3u$*~!RFBoEȿpkۙŽZa߂(V8FH=a fh@Ͼk`AKE[23\:sRGr홃6UYowN B >fC4jSms>mJAR~&hD)ث/[SsAYⳋ++ŦTMFf^w?t׉шMthؚA2Kr-{?EZ=҂bT)R qTHCՍwjߝiAX'f#iɋ9/tbl6 \U-1hfyu`g".pI!1B7ZƆ89o7|{llHlwi(Jy4UTU]m//.l02:71e1f""P`p}P9"Cӹsi+מw!RC -8Ĩ%XІ4t㸦ZZN2 )`6(z#pXA5hJ 㩈o @Ł*bGTb @֠=*-PAL+Uf@)1J5<H T@K'$~I]n +QX]DlH٪3Ψ& ľIITv 6JsLN"`TD*ALeUz(aDdr%~N\,*J.A&fN=_:C1rC )XP9# %hS@HӁlT,vcԇ[ן''W}4Um&|$>,K#FWb I0,_/ B+PPU '$dPk!E,[&_;U U{`uiT QbqG붎?Ћ֣'t@y#n7Œ񍨶 ߚP}W~Y7 QzcO{ӀhgzU_ٯ/˞li>~&NݼQ>J^>+ӛ}mʄh{;1 hn *V#T̛X< Kŕ"f"91d9 | Ap?'P0d\N"dZ\[gd%S0#VŨaLLxl֥$SBds(\e1&ޮ5s,I0+/gXoX^X؋A>=]?c)icȞ{$l+ud  jv$$w*#-Xbfo1?T_~t ]s/Nz? 
~}rr҅I1EC5vtXW(Ti{661w}\J2WRmO |oWP.MVFufڴ{|jflf{:_)1/;+ fȇ^Y6` 32w#%(R2 /@ʐzgmp”$-nŢ FI{$eEIQZ#QCr3aTYB&Nj+k, .!` V*xnn@V"NNSXH Ǐ~o;pcX"Im`a9O#gM~K+6Oq>m@V]'Ώ&Hpԃ!﷟.%LM(<]9]FB emŊn\; RZʪʕ&p5ː5ƌoScBb^Ka[} Q^)ƅZ_Vq2ߋG-=a\"5'O Ko<]~LMJԸ1SPk˅W[e$Ly],hxE$u 9v[] 'xP9tF952hr0sXiWeE"MMFU|Zzld.d\7JT2`e Nb* @Kt$ÉR+/l#0;Jr2fPK!vNS$\jY o5\+ppmڻRvCܝn_,t.YDȕ[ޗ[Cn鳞`+I:x>qFUp !y:ïƶOΖzBVg7OJWڌkb(r dgO~ z͏UDΨ(T_|%&M]~]Mae{\^n^Mi V`{ ɗkA5W"0b4;goze[Zװ*p蟯5N)얀*_#LZt0 ܖ0Edټp`ٿR\Qrx[/n邷=lR at3A@\\ޙ1 |V0EΦQ%Go ' !>GDQ١NPûAhqPX3e2g[O_ TiL M@EJ]ŕWX+'w[tܹMl_݉^^wgV3ݴ;n2g‹N&`ĕ)7z 9 ttd0,XbA'xfV>1Wؾ55G `Tnêkqíl.7\I%"459%JnH:w銲P/Bo4\z qV0)oH=@L&pN[:tCKnh -Ֆܨ7L7ZLZjP"+9J 4YD- (Ⱊ[vZ/S mfEr$i;QH*ΐ$vK<:~Y)WsН܍fȾ^C:0e\(>՘`,=xxXBҷɨ6 oe{׽l{Lq 50 djD>{I(̽;bՋFLWWS1lm<A Jhi?n tNO2JFczBWR˴#ТpI4I:eJYF3k5RѦ/j6}Q෦eW>:.sDuSk9,a=`Riǭ^g36N0fwX;fve)u5/jd5|y3e5]Jۮv YѤJi*B" KSUL2UƵ%pK1~zRphE˷);u*3vq.3VwUP)M4K*)K`A:$tyim3` {2Z9hB`/p+@j/j/ !AF0L+tC%n xJ< bM2*R̂'`8eSyo a )_o _lhyE a/Eb;ERWUIHS7բWQWR924 ZNcG50 &B;=, %=eD&y,s1U\%AJrTȊ;bTRxYi&vM5ڮQ^#(jpӳ;HKI4XeA\.g)np1H'fX`+-`* )NvDc&3 $͑H'MRMOG lRTs)b46kb,6H0c)δX0dbTRP+,K,!=("JW`q`"d>΢$8,J(:bDFSJ#dSjʎÈW106{̀AKbK1691`i5"7HnCA= L%9Q%8J-hh7!j^ϒ*P %) HI8@JH1lz~t΅~Ë*$h;vM%jX#x֛"e|$w7H2}:g3giFa1kYPm u1qSo<`/^x<[+Z[a.K-LR{uUk1EۤwAzyϴ.G ZgD0/([ PHq~'֮yϒFdع!$sBr2x`bc0*2A0* l1Y94P=r0Զ5(X|P,,հkN6.L`-5(9v) 4 hNSmY/qΠuqd( i*ii !"ԁRZ"y1EiNPb`ؔNPLp#ܖ6(`>fX6jpKsQ kiLW/>FcALcy@ZyAFoJ Yi&l'd5U "`M&' U20ȴ " EI\(dM3Zw޸~2Ody#4_ ]Uo3`soܙ~z?F5wݛ{ oK+\7S€/9f>ЫAs'3?DŽp oom~u41y]gO}q0yh o0dar >0\e>+]wwj>vaRqSS8X4<*ܱ~=n'~_iK/A8Ϯ_|Տ{M À;ɧd[`3oyڿ3a> ҅/̍%tQi 9ӗ/*^z@_X0qo!en/pf8t~ȱ1P<3û٫^i?}5A( 9AxWpT0Rp_@B"|*Bā`@OUd^_ r1]Rqc80\R̸U^Hpg9._YLT.-bJZ}?kI6A$݅#Xg$2,F +9+ YM@== vFomʻ6xU TG,Zጟ\ǀ푨)Dݜiʉ>[U1ګy Be8mgt&)O:[Zt2XnL !ϟ7,;וh/VZr7U!~ßW\׽q+4 z.+1?{[`%33_J fbC u;Yj9b?s moĸ#z\Q2t%0%9X?26RXV{0f|mnByy:)!qp9&:1ZUc<6T ;y)VbH=U<1HMT`I7DZr7ULO=Rrw? 
hp*h'r L"r> *JQE=S)"&7S)hsъ@'=A33dL"S B2Qin; TATFl"_7^q"qڌK_"A*&elJ>22~iJ")vtt J.Mʑ^9f )v{p&(6=FkYET'v=ՒۡV;=V0PB=;w ǷJ坿'WUxrn*g'WUxrrK&6 %)a2M(XRcm8N:`p= l)|rFrk1@Y֚WVJֿIdoݎfyȞ1&QYc[46]EzI<ԛ@[ڵsڸMA[)k{{roн >3 xc4c6w3GEA{; O(OQǼJ .q h8NXccêvxOel%xWY& &c8CC'Ɇwbc- "b޻6^+kpEs p990p[?+pIOaOL = v\F;j+3x BMёpm|67CZQU|x<H”P*ֳY*.ܫDUd)196la雱޵Ƒ2K\źp`;"S`YYr$y'AaufE]E~f qj!̶_wn$zvrA)\w$Iq?tRx e)1tZp0O9)$Cɖc">L`{׍N3q` 6Q؂:Fc(xKO7tiaܴ囏 .~|~Rn>曫x}<\\ 8J9ӬOӴ^15tfm*NlkP8˿0m6{)fn`I^31gÿVOW/.q2"L^2C%!J1IwdsAQ +Du^Wy]Y9H+fCrSmf|Zy4EA:vM5jݏ|o_.X;׾G?d~W>͜;b0⓾!0Р0RHj_}W4x$ZMD9cdTRU:@ ^yA Y ^|$'F%xkxYS/!n)GͧjUDmy~װ5%k4^6Yj.&vQCp`YIf%QlȂѓM7VЛ\ń< [,CՒUpwv!u$,|T83РBX`@Tm^Bb&p08pud!F gmSLݼpNJ5\]\@YSNm*қe- -2ڑ(ޥFþާxE⟬i 0DtA1XDlT6g"^Tҁ?,֌E:'բ{C_jѪu1 8Gg0&dy96!kB2^pMTI&ARD`g`.c(') `ZĠ. !/kZ Ǵh01>WHg jw[r,dMkn?Xl^swuޛR~!V#s"6e񻰾T3%J+0Nɵ6 s.L,̖-!`H=>Qݠekdff5&#rc_=-dC(ښhSRɅՖsA0elLz}"@}5Q6o#C(qmǎako | T#C,P &x,H4Y(T ƳGf&0Pq)\MeY* M2L"%UgN쪊R\_( $J9&#ȥd+g:ůSCVإq2Ռ = ~ɀ53Np_mOi;Rn=)-l^8maeEV; =DV<=T:[5IrhHc+b F[=.W0_ӟUru5Ej: @2m?OF}q;VʟƁ3n3X2^\::|7X7YC;KW7#˄ۿ!u˙|]o݇L+kFe jY9oumQ S&RzzkR!.ZHo.=&ضװ|"gw2tNG!N{]F2Iv[J1?[y98V݄VYJ &L?FSp:Ʀ|S #Mhp@iĚpTmwvZ ;ۻ)SӘGF[;YQVmAyX>'-o79N)NO5ALd*&t mcܻi}F5xB!̌Ryǻ;25+Ds+̞f_5poT,[{4yEH}㉏$IC[%׋=_u@SJ ˷r}G.oq y{,A[1)mSSmo.Ig8:Cp$mC˸섣Py\z}7|4X&_͟ӧYݎYv/.ُ֟maO=;K!2CT!nܜ#k])69r~C?K0kUn27ut [F& 8ڢpmW).^Ȳ[9˅,;]QaJXTs oK[JU~7:7m݀Ֆ~ћhRIR=k~.8o0[%YLNu PShZh|T@Z{H/"Rʂ yeP$\<ɤ(vERn#L;n1U-euUɎFkv֒{SϤ;1rl뵧̚ٮ`vOTk*J ';GJ)-`0s}r"aV@PqJ0a0G}I0i&DkCػF$W~;3ӌ1 <%2E o$ŖŢuT*f|GfA. *ㇿ1s d$ ,OMF@"È%y1A71h10?(Ym3:8^?BWo\G0IjU6~<>bcYЈ ê 5ZU kNo0IZ2AQtd,8[qB"*Rͬs1`MW>Y"lAA(ȤCw$,$n"Ӊ]̐NzOQu2sjy}RCAb7?{C( zKfTyB jc*9<[暖 #h A#24xA"S)(R(ghD%н$ElJ+-$#8 'U0ndQ"nJ pĢgN(h HiLigE-=}̚-Pvs$9ʭ4{}y,s:N\ Cّ!ԛȌRP&Msu0Tɛ۶wJ#R) E/wsB}62tR#Բ6NReᑯ4 B8n2N<{*&6ZNCOkQq_Z=ǭ0Xcub;/zʨ IZQ9͡+gb ܁Fϙx&%'˹bS#=%#Ar4ArN1\_z|7X{sx=9C9\osrNmV;gw̭Eutp3B\ 6d&m3-D0L qU"&[YkCe\c)\CC4g 1bJ:"QKI>`qM^kmWYU}+-۹XtF zZt%P}WΡSr<;I7)_"'60kBPԳu?D h:V Lbm d+ #"P*Kjːq- "[0hز1 "Wh\! 
,||@҉򰵋.Bcm(|h9C3 |C*lVW59祴VKHӠH#,7;N ] i·:" .l{ >z~o;~Q0R4:DCMZ~qre YIвe7>RQ[^'ʞR(kli)*><.H$옖֌[ OD^Q xW @Pt&qxs.9Y]uϫ(x_}$0c$ oc[O/PPAoy JC=EKJ. 3GAn4B@9U"$7YbNi! :w*~}¦e(k(b7߳mg%hN*Dg\*9bQFP1^4Q04#nWðu۩![vؗ`0XGɤG1b&FT8ԀaK&Ql_[ꭻ}@ R: T%/Ҏ# %7)j5(yye/LuʆCA)%z PA_~(SwwgWV<c?JoA\u_DžIhagIWTALaȔ!DV3[ƍ&qVޏ;IY#0luQjXROŒyVFGy}M F:*4:ȷ8qstkY~Kƒ'i=kkm`p0.=sC9xDCj{# Cx=zhé]7:jo:_JK~W%b\mk_ɯ֪߭넮ff" /APޖ[{QCtЮKT+qV꺼|=q̐yA3![!W\^/>>U>Se/SU #)I- &yǭ1?&SES /=!^SIIt:(f$oķߙ=͹xjxU?>/ 7Ej).-M>6nϪϧ=81쉉7˧f֧?dzGfSl44Oe՛G҂)LPK-i~ziKۘ,~m)8"4GCqq6@q$I`E"1Wnːoh#< لR!0 .Mm:]*=]1L3*vZ6~Mu`{y R-ᚋkݨl@t/ϫ!?KDMH6hRYf ϣ\!AQhܳC*@e&:hP4j  r1XBҸ!z'L#w"b̩t~,`Τ `lPOLt9^s0[9DAp?A/WʥvwmЛA`F uk 6 mI=?Ao Nqz #5FlNQc ń&xIjb3-_oaqЛz¶흸~CAp 9[E|QueWnύ+yþQu]fwN@jҟ# ׅk Xsdr.iݕ%VRg7f z(ְ@-\OO{Oij^B%"7-P"4'MD7k0Q HCrb{P$|OW;.ݎf阽؟HR-_c[WAv:MM3&X="]F|-~?ŷkIZ\z"ճ|88hJYQʣPĻ=k"˸4RiAΟ+i?l0E}7W/wgΖ{~#J 78|w[N8pkwTcrdv۞qY!T405qWw_`fM]R<5)\Y rSoTsI;%RF'"xNcW@LyƝY@0Ea5iMväZ?ַ\ mfu7u:,Yvm4= $A23;kI/PWxvqZ~9RdqIJ/">3kc5y!UTxJ}#dΪK Yqe\ I}>odJ5Vx?~΂\J>F%Xpmp|>Eso$)I@^XeKZlR1 rU2FKlS Y,v|dU&p%fD6C'r!r2~myʙFTQ6qߚcED vGkM"Aӟ_$ NglȕI~F.XuP%YsKCl̡-5$'V,(4OElYXIuR1 -I.Alp)\ ,_,1 \xcOXƱININaV\m]utBcp_G.FE/x|ڇ(ވܓHwm.1 Tt"E \J^D$>JJvԆ % e RҥED.ѐ 3%`7)8B5ji kFrQa_\WQ(c_ ?_{gIY7|9?^אu'LZScؠ'3|̠ zxc1>>3|]vخӃsq?jQIU54ߖe8<6o ;R,PNWJQxǝJYf1Yzo xId LRLϟM"Avd&pFdF_5#=;%aбS!r$Th8?JOSҲk())qX )8e\qfL ޹ԟM9iCchgC"$NcASbMcvJBB)fr:/JR(, Ks҈v&h2V1PL4@`c2zKB E.kKbSc@4G~I50jn#R%G99JC57*C)XX̤䜃:QyNHg@i|#`֖ " L*X>P$Tܫk5A 4rI\p .mDvuJ{%$)c`5`/%b`!m5//=W#π ]QjكRƔ9&k`bc'1Nz5ĂB4^PSEzNcuRV80P'TiN{0sWgcΕ^//vu:l-ec \MO`<0ɽ8$Ρ! ud&A0 $^Hű\L,%yt:\bt1mfP >PnI çߜ_]߀9}(@ekrPRQ4 yN1`/a}>4G:tlJfWDZւͤ/(yc m8. S+E["*`Ꮗxy.p䈃0+JG fQK-tsXY IHܐX7h"JŁwܕ'Q+OH{|<#ӫI"*IY)1%xOIVÅ^-f@U 4PT+jӃsq?]U sTa Xp\+4o?Пu^1s0LoFM[u$FbK3"$bIqmDc΍iJ~yFuCnT?oQ$XE65ȡ!!x{CLjyh6诟 Kf^ 1"`0&pH<":1ֲฏn Q$Αa#,qiC,"P*#P,4/ NR['0my>|jx18,mVsTz_HD,X H`Ėbl3BZ؛m5gH.ۛLuF~>+ obd\/_/? 
~22M_O^li/r߱<"".M+pnLS\+9v!{_EՕz~LXS1rY,He}D)#aYcꜲ# E4KnV:v˥A侣v➾[6ڭ E4KǹDtfDK}G6femi-Tօ|"Z{iUpre"w !xݘTo1Se:3 ~uSc*QH̳]"Uz6)j ;Cl!IeGCWұ2CMub2GLJJ@OaK# fAe%{xIJR9`ƼV:eʨ,,E|)vZ03IGf yڣ5 0S$6(b)>xJխ>W9*@ƑD\p>W+SD$lp'aN_ };MJR}Da:FP<#+ Zz֊sI{uUesv H j fh۪D%%X{l#1SV8|QO1xiaw3[tRٱ<ѾE}}~Kx6[7 -=2,Z pY ! ^[Уf~~uY ])K-m>/-krBj1ep$h xݼ,ɔQÒZXU+,PɕØZXVc&vcVA[m+k`et Ld2uFk3+G0ݨ3Z+Q#&wQ*GgIʋ1Xҩd2X'1=.P`K+=\y`JV;Fh،O;ϴb)Z Y\@J.(34..y$IظoT^>$B6{oo/]/?|:VU(CN1v•e X^Uj>Dl}8^8s< y ^~ngȊj}E<^ۑe)}=ȔzсJt H)hP sA-F\`|`~)c|p>tt;>D;M\ŝ_\[sEl!@FȠ .˫{_.wbV;/e אּDc-nTu4LyOpB5k\%ZHbɱ]2} k/e!J䏴#Nc5Vl/+hhvy]wJbw|+-zAɿgi4iٻ=s=kϤw^,G8^ϡ^ DŽcq]Q_9c {q86xϱX]]ρ6{)(]&|GcV1ogc<U+EjȤ].ֿw=G aw͐g.-ָc4)Ş9>oW9Q>pba(.@NMDj4eHcd0!ڒ]޺`;|UZ{EWπDB '`(ªFR@8 bk}$ VRJ73AWPg6dV[{N _4 ̦x)# 5RÝ XKB%Rdk4(A$.Yx* 1OĹ$Jd~lĴ<9bլ5a%ʨۏe+]a!r'[uCZ_?=83KTB/?ݭVRoțwX- ?2_DDH@:3f xeϷu3 [Y޻D{ N~ m[{bAW(&~W`;`f` ^3mb#WXT`#xNҹXXn :hV zqvE*-.!IY!gbޔC`3@5+✞%YؚiU""\@gJPӚ )%{RS]ioI+?lcD)bvs}LnyM4E <YEKbV EFFdEFX9(=Ipq@_ q01! +3{O)\*rʩ Ib M:*"XpkL1pXzpcDPJyM2I38;ЎE7 B9039##N`09~"2|$=GH@a]7Uk؞1xS:MK=#cmP0!gpr&D_YS(qr4D1,/$b f9/1IDpB/iĬ7)pys4D iJ>#&鳝ψ(s%J;6= '\HQU qI٠Nҙ8K0/F$A k9#Tzmx\?/O 0}CKRBOtbtս QrPr=M:\I^}L1UL壙t|b0VƉb_u&n6|F77UA%jsnb%%r'qMb+;w}odڽ77a&u~S P!uʓKF{؈[lX,`(qݼFC9fmfEeT\9"D\c4Q|B5;-oTg N8V>qw0z0Y&q7NPdHhH)p:tHf[Zhp;oȱj.sxm2$G%!QӅG[fݴ_Ԛ$(BBkrР"~ 8vʎ53NX+oHpzheڇq Lh -=H:>=RM3f%`YyM(O9}'ј}¢rC6/>j)ox,vյq]f(lO%:^+L|G ! 
9'."Øs+Yf2k E'?}9@IuL.V*hVmrd$'9ei.LR \]}tz?ă 5 &a0fVcro\l'櫮7ˏtC5FZkI~| .~Q6 (ѻ\}>Ff1mݾ`7b!.anzop9s^hk!>I_D;YrQrΪ(ZɘF9xi=@:gJx&Fsg|^VׄsTTMtu%Ҳe>"aEo;^upe!;׼;WE{&0[*~U9m›zHU!XR\rSՌgՏU /fU>i?ߋ_B ?3vq}Xk6 +)9'jsy xƯ;ow-rm32i0I.]Iₓh,T1/?BUyHe?, ;>]x~MBd N`c4Qs[,&܁FDzƄjL1;K?šBbJj28f-Qƌrfۅ3P@"͸"'HkǺ" QL:ZLY?q~qْ8#ԁnƑdcܽhnݻ$89bG}nO P ڿ^;=n~7drpjSqs=%Q0;yZ^q2M!ZD|~&ǝ-.8?OOtoN|(8&"Gh0XiS]#Z#%T=5ė.jɪJINK5]?%>ç&xJe#O(q8h̤45I=sZxlַyA \tx0|6r,]p3@#ϔ^K.6Z^ު>~8 ,71:1"J },x {5^'RZ2Ieun&sBאΰ!&pv3 b( 99O0.a=_8]g~z'8s5w8Z :塁l.#_1lz\Y_Hmc(LR8.cf8Lg!L~92}~]S!fdAړLK*o3hNj^ja 떊AImu;zB^d-YЪ֭ U4EX)/RcM&6aR1:mԱng3SOfݒjݚА\EStJ@_ŦuӜ\uKŠꤶQǺq= 8-vl,hUք|*SG_ܰnPv -*Fvuk=uKukBCrM).YȦu]uKŠꤶQǺU?uKukBCr])ETJz"{9 =h|>=/[.jƔrfl#7ozB+/Af޿lBopEGθnhހvWu!䐵+3Q{a28fͦi9=wsF)`+%W &ޛW˯="UXŤ\x}W?w<=D~ͩ7z/W_Ffύ\yzk'FQtI[qRՏ9p\3?ݏwonЮ:L{p/@x2 B[w="ҝ7b -yڅE#Qͻ?R3Z!v2npѫ8~1!؋G50I4/9=`y`G;p6̟A$(*DúɍiXD^aF@LF>H8^g$8Kq PeEB@t.}pf!t6FAǭQiFټtXPeˮAV(՚S T9//Ac.D;gT3XEX5D d IxsVɍ fWyUsIc-nM>8vrf[&M*`*nz)&J -R iiFcHd"ϫɑB GHufE!])^dY~C1 Ncpw?l|aVNONӅ(A n_&ϑeEs&KCSHMjќ+ɯZ4hkK@m~mZ׶Gʯ6F~m ;U0G6&B0y1Κ;eFiSК_CƄg\NC~k@S:萫, &M.v5TPΒ[7}s| պP2Pj9vr.v?'0⸴\s/k$Jl-35?%JJ@%SX#WF)}͘SQ"H&,%Qެ`&1RA %pxdGt-I#o؜bo&OFKvSvZR4dVO4*,Kf5h  m PXx)٭A@ 3[1'?/n8 `B2Ӝjuƅ Xp!&dDF2#2h#AOL1vq'`@(AB+""5q0%qfi×f6HV%0#a ZÝ3ڪ( #H%'8|Hg…>ƺt1S5x*cAyȈ"C%Ȧ ^͘Ul#RbO^q,G[C(^Q~HcHɻdWyڜ 5}9qHjԐ\뷛FDIlRv(KWWUWWTg8&XBnI#5;*V,dE)HKG ɺvEw}4䏛'Mrmz"8fqUe *8QGm);*qiη7$b>^ sBhyUk"Wa\zac}VtucwZ}Bo;e\r՜'%^_oVjp{q]W-.]2׵~r[* ~,VInVI֯f`Gy],)u(,h}*$uZaTUUeD7TS琢'"`.1,` fU%L؆\j*8I_%-CR?uᘤ| iCTȠ@]h[e\GrJ5F4̕{IgrQE^!DcmnxjY[1nmqJo\Ybqcud?Fs;2ZFCѫ:x+k@%\'6E%?'u^(4˕N Ա<*Ξl1f:ܑ?D2F4._N(0 ty=6'^t>sRQLǐ(_Z }p.6@-T mF[V)yAJ|IS9;BK㺃 N]JhSj!\,qIJ=[;{ TZm u$B7sM5r_u"$wxכ?e !3,tX@Ϲ/K-Դ/oQaGBKdxC8b&*2I1oQK>d?=.@/O82Gt NuLNre^ Ns&=Oo%%`fjt--cCŽo JI[/x-KuF'ؿ0ܦ[)um0/bĶLu>x$9X^>v!u$յ6W7xrm5筪8R ;P:bS![AT KGݪP@siiJB^ bzNح膴q("|[(O֯^4ItsN _{Gy:?j2Hq<3k굇?R).&GE~OVvRN?D!auye+E,@e#zW rR?t`+Yu ^ \ V\LiwؔC-j*55 %?rF1P9Hۓ6mnL PKu2+tDMAE]ٌA;BOڒ4@/ T1sCnݻTP| q eWZX-w9j&_0S|*ם .1Q}f^(*OI~Bm6rJA$N/ǷER2Id9ώuͦj 
OD<`,PnzƩo\۹Z0f?PN"NjfTׇLO"6m9b Hd^/: ^$/01wT ˬ , wQ!w[5nwB>Fk u䓱Qi)ffmq B,o \gqzcY̟)VMוd2@$A YA>57[E@{G&.:êܝ* `ŠvL@~}* :TwiVt1.Pe"zE1l Z"-t5KA685P_"爺ԅM"-~3mJִ=IzƼA0LUױ1NZ%bBF@/3vs<2|go!N{ $Zm0V)XC㱛KptK%Vr?d77Y uQ!KsVQyAo;#I1({ [xHW-L/1՗Ѽ!("s'2A鰺 D[)?j.w7"){3xdFC~hH)DlP|^WnF "ZTZjxgexBٻT7zj 9Gh0|7WjHC,0) aF0.MJ 毚e6 +Z4 ;#SF.PHD ̓ 3"64ly Qlw\?Q|vQ ͑܅8b^HǥGjn_qRѠlRi8^*/G~:EUN'?E(1(ʕY$K4"Pa;tTo+9?a-hxMHFWn/xhFn47NyUtP%Ffx=rI^-FX'zvSQll4eǜ ^qTWqy_2XGb۾liD$6Ԉ`=8j=pȾù DecIÎq^oO RI)E)]&PENEqS^#TZ9]`&]?oq}yGQ=_Ep-&Wӌj;P&f2VIF} gO;_qq;ٚeTݯcQ oO)r fݟ3OF)faf0Ɯ>61jiiq7ŜD|K-YZuQ5ZwQ% Vߣ<[rGYj߳d&ǺQ~kmV/8kES " \G7Hr~f%+.WU@Z7əɛ:~xeiu֩B(9"d' 5gGV[!U1Obѥ'^ uJEfdb[X\týSsl<ˤO0 Uym'QGl $\ak-+A֊ãIWNa <;'LZ=%xoo@5/Hˊb5/_|~[Cn92-)O~i8XjɓX:Nvrn~zcA;])H9\k7j*s=Dt2Hq&,NvfR誥t}}/{ ֚JSFۨ+rLAQ>j) A?.D",T f ɧcDVVgZa 2z3ϥjY<8ȇr{//0ۡ]+* W@\j}pV^}Ջ£^^uŋ _|7}e|/eo;Xn5sO"+S9R2 T+WkmhvD8%D3OӒRTDhB*,Za1.4[cPO\%8qMQ8n@Cv64L>/{j%rH"0R#irM ŵp4 Uiq1hC6+4]]D>#̎dz鬢,@2XZ}O']orp3mme>%"K@?}/1h D4" ˜I[ 46+4rdk|W{$=8JJO:KA68k|NV(DqHQ-0/j7sCe.bA̵^yUͩ[}4Ν <e 0+hrcqA(JC)`8!ϽbC & p 0%`TZIz`5qo&㏣LXnW|]vXTrhI.2.-\נz۷7sOn"*"nץv]WzXƣAi{0kڗٽf>zFF zvUf=\pzl=oDRSK_m^2Q˕ݥ5*9/tBM7ܩLw柠*kNq]z6jʳ[vjo:(3Љò+Wل%ȖMZBpJJdo8jN!(73`|>==sD,rQf7qSRxۆ0bBLFDcYpAq[(]<2Ta|&z>ߗ|H^K܎o, 1dܛɴ9qږjv6t.a܋Yk":YEY:Ӱ,g]I4XA^01 lЛ|< [4mO!jՑr #=6 m*J^lx/`qx`P$dbr#RPsw0Sm^yCt DŋMJ@J7 v|y>t֟*wC+7{G3@xe@R:JU}A j3% 2Ҍdc>>^HZqO͘)U`W1^{^2UЛjGΌրw dfWaz3@5seŶt]-k>˲]P)#]0- G#;xns587GFa;Y=>!Xil@=y3=lJ Yr~3`&бBF#q>#~Gl $r 'xKZ]mkL˷ocq=>xׁK~!f+6˭SeVkxtJB .:0"AV*WO7csSwfoW=2/k,ӣTAlw*GS{&{@GAaYO|B*b2NĊ\nbxɱ̩.9XX[P  HR4OG.6o<i/VEM$d2sDcLw\ϝWYN^oP<:,@ -f'v~R!`dړrasXZt] Bd1K3F(όʹ``#t6-n?!G'Jkj,81V.]#q4aJ$OQt0)BH$F卨ջч?#ew(&;^PRx1dfF u Ѡ {r5kޕJ#  *GqBzV$HC; +] ϶!vHP[К"O#A3keŧ?Xih*O3Ș# Ԍ] xQfyK*'T\<'0Qϵ2& VId|S0F6]"|Af&3ƀR0e0Ɂ-!cȽ `z<)KH %G\mEAB5TP&Ķ' *FD.|}}ݢ8xYyI浚?JBFˀ?3b0)?YnD`%o]UA.O\o^ t:A{B!jo P4h_g:cPkuKs-_~>#Ov(zG\i$l^bW1a !޳֓s9->)ڢydj:M 0 0:k6Ӗ̪Brl GBxZ;^2zI C|Ѕ6XhqNO*Hwxn%4ÝO$9#Iu?w"f'CNd;xT!)76f9ם zvwXx,˥Lx=ڟ b{?GWMC"Ƭy8ȗ&o8/ʇdBH1LTÍ) .D=w&; 
+Ķ@+^1gãQNF;;D8#Z=ݹNўٿ=PLEI%1{[ FLИ-v=9D/5wFFRPպt吔9LC4H8f= 3"bw"Fbtyo$*$z JJjPN+kGB"%S}؞h7I(vADH /1Kb'ZT`_#'MZ*-!;Fvh!Hw-zڭ hLV'{bILr1rX"&WQ7h txQNM:㰑@ jȋkOmq8y|q;qNBy;DiD5-rn.<9"1SvFy-"*&w{Uar9VhPTKI$B8_7G/r$Jyh$dw ]ULKUeN]lWte[Q$EQd+jL|}-^ƼJ/t y"$SF%nh.hVԈN}m؜vk&V !!\DdJ{sڭTʃiv)²NukfBkEE#/V_^I+ 6sZ8 &18B*r[Y^=g6IA4jŴ+F c(wefW R'9KN1F"%irQBG0ނ@qnVȝ!dzc~:=%GΩLrSw.nWeVᒡY 82g`?4|~[6Ώ)I4 6az+_7D|\$8//t_gS}]K7W7~szLTxS3›[Cl:F|o10u#f!|]ׯLja /S 7Ç𜜹i4Ns{sb:q4ylZt5U:>s4ʓ;Og(Q>)~{;X熴 \ٕkE?Ń?}[n29|~6r%@Bҳ5=Mڠҭ=('b,Dm[n;7W)0 )!!ٶ{x奺KVd$,DmaWAVy)÷nS)v+eȝ|P3߳?0 Bҳ5=bcmV\Apa, 3elwN+C16*C1 ت6$wR^C_Zm!/Iny~{O7ƞN~ Wv Qp1G(;xf!v~7SI9T6m+#m_MHpl;V)60m{^L gӽӇn &D5Nw89M҅&K(-'O᪤A-O 68_a:uK݄^cC0/G~=&!m_oW[^Y -$|>soYTʉ6)) 4Dҍi&1> Cf2$p:9OOg.NOiqN0{d An6ƹN)([4Mv+$LOm㋿:2R|欬iD&|BH(_b0d^'{1z,D1>0BяPd>ޙu G)ǒa^' Yvד>r1dBkJ #}r&M]#2B[0p7J'F;Ct9_G ~8kBQG?˃rk)c TөkK@Yz O@ZO4|Jz $}Zݭ>@9o_?O./ᢅNj+#9 $hL @[NLNⰮ@a *uAgfke\ ^V }r06JJິ>90TGZA`vX Qh`, +V Cg 8ITkk{o? `X'sp@Y#(8AmHkp3ne67eJ[?ҨH!4'$dEuƽa{*4d*@r{9r_>oJ %s kZ ,[^TUOlt S~ꋃ{drY&!*qw/n8Ԋdr lR7c4$xfr>K*.MNO׷_՘0i7Fm渄jK渄jbHl PĀ#wkL>1a{?MNт!fn/gXe=Xh̚dܫ9q:ۃ3HDMcIl|e6'Hm:r)~q>M*@nP-ם޼`wͿC rdƂVtl5x<2&Tl4y DiT^.q 3.U ۝6qUYВ9^X\3WkA%@paU})Fbx{~Jwl9QPjگN(;Kۇp-  )ԓ d[>Y.Z>`ٞqckjF\hG24+H%wZ&:$` ! 1 4L!ڬRBlO~*m Տ-1yf$T(e-2qA7m[gK/-j.<*9M& `^0 >AEhdtNrfhҦz"wFUHPŻ᤻G y s|o 4&RO6^gi1CiJ@0se O'")Y-U,Y氈-浘>O Z2ˀ˰{/Sktt)t \[˽@]xJy/PFM7@ad br8{] k;g4#0nw_Ԭ[fӱuvI`4Ez%q2hȝl|kAJaA13c7[oe ,TwÙqaH)^@fQ<4 bTcQLA&iBNY3g"ɓTNdi١BrÈ@5Rq_x5sM,<*Q ԬtlVZx2S`o@$1YgM417IKWw 9O[TKx~HAud\ E05/ 'Zӵ԰ )?:`TI|e"ߐ4onx;GšIj_2 +(e3GW*XzfäuX=*_"$Pݹ}mډ ۿ}m\ܕۦרu,18{k| U5rͤ%@ ɵP[=hW=XTBμ깕"E"Dw%W p"XU gg'y<80koP@˽:QӫWei-lnE~FCՄh[98b9oZdf6yvVgaEDZ +V$ȂfRi*ikrl"gxN΂Rp%[ QF@2 B`u5KMD$$-luApx.z,JkU鮸-_>M3([RMpԈ(^ɹHFՏ +`8[xauۅc"Sh^carm\e_g|܋^&(Uyhe2H0fd\f#\b`wͬcuwN=tYp,'<!0Sy ^4a#m_j #C{^@A37H_ |ĹXdi4-XllF5RPTɻ[\i #pqX6Y/#(&6t-)P 1ɀ3JGe$*HN Q/3d=K 1$U c7>D.2sp*"IV6[I3,RN>>eCh0CpsJpp.V\ⱊ-? 
TNLf'-d(Ipy F>|كXh%Im84%q &Q1P| lD-XXˡjżiIWkK.0(m ÑՆ/ZJK6c]5GLNo 9XU6P%UxrJ~\r/ɝN|4 ; 7Y6PPYRJ A*Kg#4?x v(7&@@.sQGD7F@|Ef jI5)Wh9^CTؓTvkxś)\59BR;[ )"E %Ev=ҀDkonRhR S@M+N te"5!~ƘzYwJ ')P , @B6d5y&Xsrߧ(-S֤gWُtD"gUe;j/ٚvp)Etn7R? U/$*(O9qtv! 94Y|\4Zr^qcʕQ*칽]\W~>p5YB4l5vgI?ً8~;+u\ҝ;ՉZbv7 49D2vv}٢ BJoև=tOq4OXIo(dsW:Si чZCZ=s}B^2 E RGs;oLn(_\@-@|LjgqHW(hZ*U[5&\=:G0oQwE6F޻!?NbWýv=?ٗ25&/(Lt7>PdvvSoZ\{vy`XRO+Q9MSr\!Oܹ|5[ Jlr] '-*a)a]:C?$꧶kC=^:5a.s Uע Ȟkq?/}, ˬG‡NNz^JzԤC!0F@^~^X,ӿF_ތg7_Χv=p& +ʡ2J(*Z:KY}pv~ggӉ;_O}wy7Z<+\r4]ڭYLC)9:64JUZeGO{s4ML "wFY~ z@n1Pb\ lO˕~hw_OU7i5_}5 g&y3B woW'r?^EMd|BR ٷk}d<|}@p!G72B{so|M&n6NmQ{ss?)ϱv5c2R{eMg/? ĭOq(!;]z ڇKx֢Oe`Zv>wFVgn2NphwM:>;4<0:e`06wp{3 iɅ " t+ n `8M] }Вko)FpfEf;a4DhHJÂMHFg~.K[d G\VώD@u9W6o*7&~)uMT%vBu.L-sK-=iz/%K,IDy^خZ _ᛂljOonx>G>p=dsXR~%sgmG\–pXlwV>@}AM"BmS%iLA0CphNeJ$8m:G'?Ԫ wud:Ͷr68͒#FChD)!օY!٭jMRZ2祋JXgRPjRCfT3'{ IiChd2IJ4p# .MhjIxQ(Ū+)5L#XTmSqZjaUs["%1,D͆ꪾ쬞64G}G4U|>oxw0A.аj!׾RTk@zgձR:Oz';>O!^X>a*oݐh^Z=4=,"Nש|՗РI%W頳R_JĔ3Nk4 ;rkA`u*q<2hȞ;(etMN;Db\&w臘fiɮN$+ף.F=1ӇtR,O"IǒHN`qjXptJ$>rADN+) JĐQ;TntSѢU $@ qdcI9~A7K،puDH!&.a-FACNxtm un͘kk8/K1T0PD:uP[CvxKjV{%KKjVdtRR# ej `d&(_<4NK#5S6nv/X/aS XW cWO<DaBa̕Zu6L`[k+#WbRf側oؙ6,.,[c*S]s΁ <%!h"++xhђ W$Eni N hg;jF:j.IotY&qlHBi>X8 y dt8õLS06 W!etW/'$TI7?P_j6^*xcQluqyؐ YS*:==G#[",3-lk_7E(Dwa"\W ˏ} BQ&mWk\ZGR W ̲29ݍ3|ݕE3$7uVi|P~- U_r+SMx]M.c:NygOgTˮ󮘺wJ7\RpC^[[ kpV%;ѷa}igdyM6O'>>X2M޿9yg;s}g&ӂEer1Z[ɬHHY(Rb \!ɏW/2+lFb*j "([ U\Ά[w圣ŗ]= '?z\Ao7/K$`Iv"DHAoD xn6)x(/!INcIm7 7m][Z-&϶jOgcz;k+s ceW+Y5@fe.m 碿SD^ Ă hT͹ƨuRs_'5 [~cN^j9e襝{~/6G ζۿӐg:\IZ`5zp6&FE+qyIUp7 ,ސ0\JIުy^OZ@¨8˅\~xPFe*a BU_uhzmL5 +o6m!~ Z7Nk$eO񋐤~}1x]\C9-:BoZ@~u< =S(h%,xO}/ѐF7. (tw>_ӛMyƶ9zs<4b\ t,q7/ ٛ{sۇ"* 7 c&m ~̛@!>M.H!#I րWNSt Uޢj%TeoouFTQ6 qs wԒkp7'QF73T Z9wWc\Fr;6]f ƸW1ѦmD)L*XF\_]_G]mMJQjj˨ ٬cTU+KQ0%8%  rʑ/Ih&$ M=!8S@= 9jSF!IeACPu4M۷E?KO j~ i' 3LehdZHAd7mIVUutUw.<7#@Y+Dž%m`uC\) aXc]%7A :y7/j_ Z7 D5<|Ԫq/nڝ"i||QC,Y6Je$.쐧W` Ƴo9-7X_{ӆ;] ~z[7vW~ˡ GWwU.w,H?}``SDP`o!ǗC'maU&v%1_^;f`1Ng%Rj"| lpCGzYnՍY}^۽xܩհUYMkZ>0j%0o;s@v/3cB*(}!ni<=8 V0:pl! 
xTπtMQ(^SPN%Ыrkݡ1c^i*(5*3^Wu^0v!A8XK(Gz'gA&8^Rs ˌ~/ʆshFtvnMG9`풋G w x` IFf6$$GD5 6l.ưRȑ۱w(=;-bZj:0Q1=%.HT}}Y.@PtbhdtzV||:zC99cQ==;<H# j_? { dAq8s*͏Ǿ=. D^_̙Z&m4N:5YÜu`:>8bD #ivq3C4ERG َݯzxơ+&68@((,PPX1YRIBٛCfQ/z&n*X  $HGaK+<=D2^ Dl!B [˫XƇ1VM {Ay/m(b<\fMV{vOH2;HI,GE p| D4Y,?Ԣbmשԁ\Lʷ!iP͐ P#uK}#$&QY1(52JeH2= .Al@XT\`*߀.UfoEi Y,_htzV?>3Hy/CSq{ -Omԭr.gl g<#)YHBlAs:o4QHv-kpKIRTi޷>IrM&(%m{"ʱ=KJұKإIRR _m7C ЧH%Dy۲Z:)Hcݳ.l{wԠ}SE9нvT=Z֭"qu"=ӥɰELaܫֺy]+$ Jqeo$!Ph9E|$MT2rd-%9E`']p$ME%39'x!l6-qz@t7.@K@$^s*US;fCX48QcҝŨe yH9 ޸YuPxm2h׼{f`h/5n5CSrNF0 o-zvGE`NH>{A) fO⻄A#=ֽS@Hsv?SXtQ!#T']3Rk w.c du4A?%#)% IO^iK\Oފ .O\<~zDJ$'Cf+_]Ӯ>_7'b];ùֺ!Q4b]>:;rӴEQJ)wH;]k ]r:$54朳lDqD-9']Yq]@s5論> k/ Ϲ`$MSIb;KnB+p##͉ ;s*a ̻:Ts|UygjN6%D*11r>A.| tljs`KT߂wB)%Ok$iwHSQ 6媇w ~gj&ㅟT8W+X.r?>[>(Ε!W|.fR3͆PN1a۱4 iF6o^nDVsbO2a#m-&V,2B oQ/(3L1údh;IA|) cm׹ߣ(^=C@CޔR%aXa#XWA_%sbEJ-֥AǞ!{ TEBsK_1m`TrD+3bR;dP6eYh[Y%hCEA@9 ǃ}3K 9އ=o>Y!rNwޙpѾv{ 2@n4-p7oaqA{uÉeYF]N7XnѰqpҹ󳵋%gkKzvdk.AssBgkK.VϹ ;=_X1qvk+qvlbᳵ5gk@]fYpĪkk_y^Fu f5:?.ւ]%9[8}nv\bƋ%g/mD֐ jɉ.(8W%۽VW̷o_5:ު<.D ,9*/],oj[>KvXVu/k ^X*^.֕Xbel?xr6?vՆ^nnhZY?x} /nre4r,S7[5t-:&m8h ~@o-FvK!g΢1<%} cOa[,uX'Mpquz %nU5[ 9s)crMc6A։F6\3RpșhO01v#vŠ~v.4?-vK!g΢Q3S1 19cN) = jc>3Tڀ9fA$e9ǜsI%AE0,('(s9$Pr-s9|JXMb0}Wo*."lG?~-X,. n,+vi׳z ڭf@gj-./0VTZ@XL;g#%)1 pr,6Rr^MpWn)KYa Lbq' ؼ%Fʢ?" T$HnLB[!JigsɁsלpT—J b+P̀1EHB(UA5.-0'‚P{]lMڐE8Loi eNB^8Z #8ѭI !%iFT*U v464B ,'$ X:$UUc/K&S-"Ι72Bv~|XVKTH~{Le],w6.v^rOx,}1fF0\sw,5?{WƑl},}؈s T1#?Vji$54 VEX,ohFi@ukLW,  %qS "V0e Sp,7ḫʍ(B6{t?_78f4j›#hI,[(d2huxh{lj1eoӉ ͸=F9z9͵I#c-}1Qx`Dz<6uII{$o4 35z5t n8Mz[Mnjlڣ׃#hhH&x{)"fG3/˿s. 
_R)+Q\)~se5A^lqi`S$\ayKI&JG4!^y ˼AL*)4gDgz"k߃u5g>`ҧ5Edҏ (xo}{05ֳr\pYG@c O!$_f#,v]x_)]k )SNN)D/.1^a@͟{ӟ__ %ʼ %z\?xߟ -3hȇPǃa&$8=yFxǸ$N[BFO偍X-kS&v'~7鹬i<&O?հ\%Ǥ!; 89zCYF,ɍ>j`)3ۡOog_k\543up~~ȆQsso_P;ɕ(4z"ɰen7 g=|C~vs4g~~xsTgMC1LFﻷ&gMfKYۻ+İ"AbSJH;~hFWtQQ9?T&{ׅ>1CM.}w0W^]V{<`3X:_zw\wFpoa|OguM8_LP$./vHXKGyW1Q@bA22Of]1Bs6ju>DɼhG@2y,ޫ{=5湆ᇶd,m}7\&m@c"{!jԪ(Fʨd }ds-2O% -- D!Ićl3 |%W~l {׾?5U33J>'ÅT`zhil^ iFWgpfˈ?ׁZS G\b2uMJ 3E[2qѦ6 sZ2lJTɻweCziSu5^FX{LW'6RPU8Uo!b aiI0qY&(ANJ lifw!v1JK?W n.E*iD\"Kp1PUΏ*sp*1W* Kİ$0 L}< ))W\O0Y)|h2 ->SBڣF !]xgj641\@$D 2Q?b`%O2~4B'UdPrMݥΨ"MH~ܼ΁z XT,;VyzK!6NZzֶt ʄ{g*o_P=D Q2_Wu\5d\S/OX_\ؕm; *"Dž" QP;\(ϊ o{Bj N|=v LK.S{Y3[x1n3yՙܵ{0[27ә@A:=ފ槏f2B*ETS$9iV/W>'B#(OuI Z兏>zW}DteO** FmOKɔI<ͻ5[#TyY".-2Zf9FkUR#şt%ˋ`?Pv/ ͕&!Xq)059R՝/S{ZV@&J`ۢ.t`񓕭=KӖ?1t:D iX[]ߝ"xTh4j`BKCsN.$|N!= |%N.|)^$4&}t e " (,Aș==O\*0&/ ې޸*|2;ZӴJH41xE#SkbsXYt$#`pSE:o'"dWO㷳Sep5{]tIuI`iޅgWN0u:CqVC(CnM TTM@U@x[swE+Fܤq2L9AR`4;y#vNT *\4Q8>]IDs[ѥJY׭[jPڱYwگJ- S"v1uT5]&\Ba wbˈ2$=mrcN3B!iԻzTb$kbu)0 +P_QbJ֓:Rl-L?ʫxP 8+VLnbrۭbrxZ;a aQfK8! 2heϐ>]uJx W,Unv:,ڌՒ wbV4׭h[\8g F\"'0ˬh h\Zb*o2A s@lWl-Q0X>Lm[:slx駾tz#ĜN\U;dg#l ?Mk~|d\\b*ߙ{u2 &c{"..*2_VsZ|5M%Xf3'JZuԂs:3Fs eO-́R#\`j.ڪ0/o L0V_s6X2E1g"D  S,@ Y*bU[Ԧ'y0C)GbzGO޳[ȗYNMCZxS^_Mq%]o y F+n@7.;nDDHZGOM-g*j逻7(b{=|$\77o&VK6g V`H3:MMoMAru4捩;RdMS., FWƝ&l" Кq]cZ"jևj(&砇32m"B1ӌzQETp&{z$!¨VxMIН覂yCe$yVrf1dL, 9 ͲJ >kȤJ]ʤ)ph‘ %X99DZD}X$L7]Nn/qaBuȢ0(?G^OEߣV)~<0OCk3QtpTE&>wR) ~daW"SGbvhP{Ulk';jbW>4NjՁ3JFxe{l;5fi2RKe&7{Iym }۶vG,Md9Q+wmH_eo'"$X`Lvn3֭YN&-nI-D0H-vWz!9ǂ+5z9"xti16*]JɈ_JG3 &_a+wioZ`h7g1e~cg*\Z^Sā=8~ymyϋacAk0au ϧ{>-.rSy7 6{smzX@E=$no vK3)MN0PXP2Z Bo?iX`TC]ȗzSؗlzB*d;YrUONr.d+y7? qƺ lj 'psI9+9:L* t֊wLBJFPd,LcmBu1룮 TsJ8dv x[\Ϸwh:ozr7k_ҔKMAXxc!e;KI}t.)N=X]Nɛsgg1ViO5/~[Ekd=dz$^_D*|#:F{i^?ۼ-6Me|Ԯ\~|4IIS-Sa* 7:4w/Uw*{.|̱jVu6Jɨj+#(1/,(h80|[Uֈѿ +O8$IףsK6z]nukYnqex8!"^v:0$m_VE}b7~守Vq+)-.Xpm/E_EߗiXJtof( JUG QT藭b=:qE:Эh W=߾t!cS=еÃ$NS15Ĝ,NՁbyTR>Ox('}^D ZMO[05cr Ŵ>ޓa0 3/]rUl6a<U7]69g~;{ HK.w*2}fХp.ƚtj:xȧu$[ 8~`Wf惕մ75%?^_C'4THĊ sӵp]s)"ʚ6*kz[;@u{Þ=\;Le=gX*@y(Vu1q7,I. 
v^ѨMΌ!(r٠E2Zŝ,ZI;"h#L$IGwhJdH/q5>7ըzAY9Ñ\ O-bKƅu#}`9ԼUeZeweT̯ѲcJq]++dTU~i&9/5333L\ :JhP A$3x|cy.l>Tk˨wU!^>T qGX ꝣEtq1/8k=1h[a + L.f_KcDX1 .R>T:IkETQ:!f\cĬ`V'[ J ~:V]%:G'F!BiReS=H@6_ 3S֥ 6 R[鈤Ni4׎$A e)9 ,#AyݢԨZS |yU_s{T8]qAZ==ZgV:>R1_\ %v{vْV.eD".oޜ5\BY?C~Jhf?4M·$2!2g]iz5xt89!h$O$R8 ՟p< 6-o[(!cKhLpu k}GMġox1M+rF~-s4wp['2Ngʜ0 :UqA b$7̭S~ j[LЙI$|[G㵰ҪsXhHRbHIH[;W ykfv5oO!i1Z7*2Uϳ{yxjU훲'DLC2Ia[Of-0 Q4<8?Q~.~ ЗvZmȽyMjsRbl`py`|C7c F[zf fP[1J)÷|M7Y7%*Lۧ@OP:ZX{  ]R&jsj%~c?});k%oytzsbN!k_]ZKh<46^^_ՃBj~jteQnoiZ\s_sgs<7w7T_.ׯ^QDlB_k'.-.~9{= S;-hJ_E\36+jm]/oj;R-KjVV-7gkVK=ƚj J=`#Pwr39Z&&gHʹ.Wofȉg'`U;t`cކ#ZK>X&X@Հ~s2 U:⼰h@%0&FwnFBj`dg h"8#Q2i Q(ƿ(%DS1@V~t2 5pS|@4ۛi<(28vⷮjM^ݍ2^t'w רpm#wjV~ SfMc֏c Tr{ǐVNU#Bl¹ajtO[PJYwҝ//)`]rdI0/YKy,Y:Y?_BO={ʃzGNre3g'c`ES9 ~)>_EΗt}YV?ҏ"iI~iѧPeb:WO;zdjg? 1ٻm'|34Įs9mnLW6 7E>:<i@QDCr❃)P{9,]ai0~􂂥$j c<_f 128oqnafmugķEr5Dž7Ylڱ@Di.#ӡ)d.(&0Z+^D3ۢVqS&!*A>xE"(ʴH+=rmH;1߉uF|[ ^k?3';{U4~D9#LɆ !z +6뚠pho?_5'T@ an~~pWa_0,z__|jញh?O>مI I}Qq2kߓZy4?C~v+yv&;n1[5s?\|s;} \Ob窶^oN[[u2f -vv2ŻYAssw^;P/hV?!> ;YAO6N?HE|R*nf.6/xy?@YӏWhIKn *{6r,˾tɇ<,0b<JĆ ߗdT,R) t["ύ S?-WbrﱮzQFVʈtު9 @_r OLX";="rA bVujk[ $<-ď+Y\Oc2(ʀmgécD'C9иhYוo h\wggb9);vT1 'C-}<@V Mb=N1j\yָòN CYwV0ˑ8Ϻtx~r!\mt>#0NfeZ Sw֙Qh[i?Oef^}5BaR>R>R>R>oKya J5d#%\u` 56҆Y^Ce0tѥw#1 )A*F " >)18: 9Ur9\W t8_;ͼ61AbjXdem<~b.o>A')umSǿ *j{j2~LS*Oc<tmD 4q[ZjFgYA6ՑRh  vG0oJfVݲf*ՍñDs3TmO.fP{*ӧ/ͭv1B9^یp8wFtl9Scj_}sO[uMJr*(,4X`iۧڱv )y!p\(5Y%7|ǰ(F~QdgdDQ`ᕘhs 5y2Ύgك\Σ\Σ\Σ\r(tHikhLL-0 VIZ,e(B8gES=JyJ7?XRl ¼HT)~Rvv"zە7ѥ׷7TX"%ɼno.˫Ore$z 1밁CgPWn)D<"slE6UylEc+y!eWxbe0%&ٚ9]Zz(tݬi\[!@ T|/Srsc5ZK)'2:VնDi@HX+ʄѢV(7"q5JiZŋ ã6\nJx:>KKlM/$|p0/4H+x j^A}Tbobf66sm%a%0J!$9 Nodc+T|&*A%#vDcѪ@>BjlؙI} PptzJ'#@93@wŽ5 nT#'U6J5omA0Ԁ-S]uB.{M]LIu OLl)V-8AyD<>lrjJ]- 8D􈪪S,snm+?,5wIrigRZbw-Q{LvkXUό]-yfk{[ %}XI|ޱǙmcmeWqYPwL>'U;7S8rHc=c?tr&P"~.-){:N GӐ1c0`>yJhDHFLZ5zjX.UqN Hoj%6*9mt;ґ"q=ѭpZI:f;k@LZq*ARý\:U RicP7Z Y$HIXz7g/yGg3::-Mukp@k=uB m)Q>"*k~F/,(P'r)5XHHa& 2.&@@z4JqPuj)W5R0z wZ 8S@J)5Hg: XQӷ߷HvnUF޼Oj&#£$o#$FjW:;Uֿ^W]ꔤNYR9#QrDPb%=hY 
r8l9Ja'_]?pH$컈6*'jHЀuQM#cO{}"oӾ=nyt-,Ʈ9_ÛPX˝Sm}"}r'"2 v~M?}]_=/_Dz_l~тk |N`ZͿ~D-T.(KPeThGx0 )BC7Su<noxKb@9Z ],6Pʺ>j _OOWe ~&ω/(LN|=mZџ)ϑ%SaTs_mt p?6Bm'rj\1$ OW,"8lҠNbR'{'D{DPCyTHxK%wuOrM[gKLlC8!e#(5|T dze(bYg8Ʉ=96$$0aR?el ?($W:b( t# 0H"@6O)q\bZOF( C(@kPgrh sn#fL ֪ 9#c9t˦(Ң5SЧ&Z,p-.~8 ּpo.߬|RmKr<8='đ?A>r fqW2ÃtoDsRjSD-GΤH aBL AR6i  &$0=d~GHm<bJa* IA%Վ=^5@MAM=y۲58Z{;|67iٺëykYӯgl.J4\]檺 _7<ժ4EZr_)9#&o<:֖qg+47']zXuGq7NAkD OQ|+%B2*bRZ q+鼷mzٲ앑̀ew_+ )_y,@=B{(or7H++WRs@>IK:;P_)"Z>A )4|y@AOG/_'#a@ 9_W]uN<#;K;0$r ƥH1q?TwKsHzluw6b}OIN`9H@=Gf9aYJw/JMz!Ӵ=?i{gLCt &&Uq\ia]eKݶu.vHf/oFWҟHLɾ}g; '839N1O̧!(q~egaI8sdģA>rdcO"0B2FH"G!)'G2sqT/SIWpt)IKLRtbBE6x Dg[.sT)E'rᣎGi'E3EAJ27OFPt<߳sa/$Nϰ'Ӥ5?.|ȉC{ Z3ɇa9丏 32]͌#& cj; G:hb;:VO0Nr|߅3OrܚkN7jp,Q<|pRw힉`Mĩ( ; ;o'ǿpm]Gj8icV*6BR G: 14#>$[Ż qrܛҍ^-_.[ctnXrY|Mgm$G,.Ґ,H.|ٙ;,rqrc]})dǘYy-zXUd~u6 [D&:i*sJrK0u%6Ě*c̄ }ed`ɀ+@" R "+FQE.*tt#(>al*xeѕR6jUK:IT4R%$,)<(&c.OhK_/^$lflJ8p*^ Ȉm`D]Ve1s߹<>ߘ$$RY~,/^凖߿yX-bT>N{9H)a%"緾|$->^ &pDZ9=/~s:UfཿNT70RV}/VQю3n6z2rj,JAk' -fIoﯯieʀK\ 8|%m\ 28b8<Ky  M@z/|S/z^6Ԧ `t襗쩀U'gt{RMZ  Mdv}rl\ƆklIȢJsV9y-ys6p,0*XY+iiJ( -5ƊܮFoIbN~K%>J~i7**IT9 Yh1y,T i P*Bc.m3V`8"jMаiy.ׄЀOJv(v`q;̫V3\?_e )0=3y_ 28#4FR%=2~$.Q)' 25hJ|I7x >[ܿ1b3^G@٣lyCsD@)F;RHP%l(#%/ ׉c}pL23x ɉS}pJJ,@!U$u 0n"L-<%hɩUBЮ؉B+9<6t+ [-II:\\!3dyhh4 (!\&yU/D7e}=_e=j#XU +2cd_׆dl`.,ZN6sV*-BYHΤR2\d"۰ͲFo^j*[öV3|j!XKo/ 4DJl_)i-_/4j5֥%-ZKȭf%5E -g2I۪$r! 
n[0_:WP ?IB.R_xo*Q X5/]޳dM |;4pPN;8kuht !>D~]O59ފvH^`h}Lƀ4Žar:嬮cN9T1؍V= ݑD5,J9,ẉxFAdv- ed6}TEn2͛$_Bt.(*VB5ߏ>WuT}ъvJwofD' "Cph9hM#G ##)2rBPk8aZk 1Plw|!v ^!ֽ/cF&inKo}C 8XƊadQpž!1(Gr:O&!m|CH-K"0|o $5*<ФY/"NPgj[?\|e In5>϶$;tد+lOsx=GwW.@RܓZ<}NXnBbY>u8v1>{?0^}f Ul&nzOܟ9/_VBwA.<*fW4RA1qB0";Gsn6,Uc9h4-16C91"y 68 j<&(*9(!ASYHl?^'kӉF3|iEÐs,eBr+ξaPqR A(C g`I BĀ̈o*"M#f9TUu"F{.!:@903$Պ{HaAcikh[Vw@Kj_΂?RpxPۀ*dK!0,-MԷ&;@< 6tD(u:$Q=ĿrլWhI]U]5f$-Ǔf8@- YvG1;m9"tqZCX:]O/6 N&epnXaU#(ӥ<_|( 1anJ܉cƙmIvnU]i9]}ؕ?qϖΛҙt&g^{i"q (43h ey ;H!6H1S\w >҅19HT"Pwxκ3l8K)()>T)h(>i3Eg%B1/I/!OS`yk<<疱%oCvu~z-oD$~ b_ʶqA4qYH5~ħ|1h-YeJV IύR b$‘IJxƃ}M][𸆮eô]n}8=v,YIm68ƅ O=v"'#F̮:E1 C9 rrLdvԱSU@7/;(,ej$,OKUI2#΢ҺbCt^"ځ:hHarcX)Y؆`?c[ GE+E1\ QAJmyR"YL  a #iB[ ac 儙b 8Y_F$G漐i%{^)#}X!\siscZe. R!^II<50%yTcNj+eAkDGY8UQ悃$2X:0ƑfHxK [] !L4?KߎĴ[E"} (dRz= {m*I8̔@PZU{ 808anR8I{h:vV&w\kWpQJ䪐VbUX Po[Z͉y7vʬco$(f^Aue$V XQc|hˈ rkF$scdmtmHxpnKؠ%G6D@'$ql`!Xh1!f{;}aKT˔m3 H\2VW.B!0B:Ϯc-vNe}ڸXK̚v%E` $eyG#mr p\sh-~ԫÔ8ly44tC}DqksetkV'&?_*iD,j!Ƭ^js0R;PN<]kC֞)~9TYF nM&A Ĭъ<xrcX]Rj Bʚ,Y3s@XK׆9wfy1G:uA1ǂqWXO(aG>lʬ(|l@/גmMm4_9r1%3 z4W;+-I\6r쭚&hTЖs^Gck- u4|gͶRVghkmKpd43FYZ_Ni4E/i7QswHmzBj%\#3tq7uo0_re.| Mꠡhz8$ $W*\5IUD"(mN!Ќ%td Ђ2R+ɏ5/wm]bឥ~Lr|>LҦqb&m7XV`/}㥿ד5Gv:@Ǣ^Tbⴵ-W8hbܔVrα0)i%3$ObXH^K^dR [8w2:s.<OA 'tm,EQAJo@ kD*stU.*⣗ a5D?Fjpj$1SϸZCC "*&>߯~]l\tܠ4CQ<(i㝱)J&}RaPU*=F*5Y'%P'v[(FhzY?]fq[/(is-x7Xv6_*^ DxH=.ccDױ1(L Gkq5%:A[*Okւa6Rh  ZGTyE RʖV) ;8rxqt I{+$EBlY9)*KQy[ta#\;,m0Ea Z`ӡppirYh8c`ɾIR&^Ev^Q, RJF?ZOLDN|vػmPeXmɑX<š5l89..e=J?ZK!xq3@`{á/6fBqmI[. 
_*,SqP Ԧ ؄&<ɂi gm`\c0q3l"W6̀w7SPk7H'aR鏈plniJ5A0j=/ZM<=j1W-&nk1Jkxhf74,$Z bO=eEIXm䓎KIG 83~q@r.-\XXap#jCwH'ahShs/ͬxzmջWy/?_s6δ,L*21MK;ky5q r6A3R}YB{HHhJd 0q&rfʌe)m='Q,fz/rVwid)!|DWk3զ̘]$M)e~.Dҝ!zhn:ВpWUFg3ITEfw/ (oMYnΰ=c{ơM'gK*˧D} 2bQPF,3^*Tꬠ_f}?_t=Ӕ\h>{㫇LKNT>jPja]T}P%O9*hyiJM*}nw |#q Z2CY\'( FT}ZArj )#da>7]F3X#c@9(Qlϑ L?Q=y8_v0_&*_w{4و/'X$M׻oM./ouJս_L'?BU$ syfGķXӯ` Abff5I!KVqD0l.V)(xvtӣKq;=J(GGD'ŁDȏޙw1Ht" [aPk#o,8C$(P 𡗹U^%|1hrZTBfhs_12w(sg:h|5P-H^*'px2ֲqˋ yIu'XRkΥSgh[Rf8j 4  sl !gHC4Sf%6d@cId θOک]C 2 ŧn"]FZ(5{ F$I]lnirw~\*X ipe.-`0@B8&k:r AU^[BK'*reJ(rkI.5'\94:ZpBi?i8'70FVX1zM Oo(R}6+>a͋~J*\\+prռRY Pg"WO~klaI.,硗CaKɬ!+$w2%y?1}K] G9ri]N,>CLj,Z,:\u~{U0?йk"~\EȻ6T@eK$\^"1#Iִ⽀֮ o죅aefÆ8T%l b׮l#OaS$ >H^2&}?-"hD 3Q(RR(L׿J8 콡L; B^ 5z [>50b;cIIr%ޓqd+^à:?y;&L V-$;̿).RTnR6VuVg11X>lKEf]듡8(r@}E)F@Z i;^? Wa败!͸d8i< pjVqT($ZS$20o\cuH"s`c@\QL/ ш6W1ȠIx D'r:Y ,ff Xdt@ovb1kRk& W8a^1CP(e q m~fk44NFu6k hggq\>?M3En  (߂:THމy4bK /~6ӧZsD2('[mQ?nIOf%eQ';~mDg/>m~=Y9%Â$F$9\t˗߿|H<UYxŝ5VQVyUxJ?͆yLE*Cy'G#tN6̇E0R$CY9yHEr.C=F4[) mXm@[RKF8H(VsCqJ_T{sŭy\!e^zLxmkA NJZG½#)>MEhjI |;.=%N1WG@z^HM$y*ur(P[rS[EIpScxT.~SBvz}\w}7yLsѽS1i5O/B=P_p_7L#Š${|kF(wM+#3.u0 FEYGITi`C C g"?P` "> HMӵ9ү/:#\LzAzC9{QYCkG):[t9mKWQ5k"< QZt "pW@VL] Hy? 9s;#\6D&\@<=б6lR:R(E$ ([C)x9G]<"!IQIcpkDq.2^+% %,٬b JGzvYisT~&x0QR**I 7 vSdq9e<& 8Ǫo4~E|jyؼOz}hY!5)+g)/r,:@og%U@o4YY.ThŜP@o6NΎI4mo]tq[L{ZS![VsSnb/~~g R.jv!8fG7eJRN;H]HS`ҭ\|,ZSŹ)T[)rXm!nŀJ.8g5b s"j]EOG)+eMvr~uΗ>ODYIaw+59u^( qɿԤSxtSF7khMׯAJ+-!2AiHk ~k6Ʀb^;upmF`!%Ѵ7m ^k; K >/e/UMB> _|d;a'mt儵(%pt.gK88pB4GT'">oY] Imذ߰@H'=-c @̠&bPBoRV,tQbJc6d&r*$8ddŽ-d^BgEx 4W^K>6wKv5?1eMï&5̨34xdXؕ, $ﺯ2d c|-|ԚP5>|4oʰ$S-Vռ}y 0{E-]^?fhN'ODQ-K"ģtQ׋!%3ӏɁ@BH2k%4ԏAqRr~g1x:$;nr.$Kwt9u^8+uej4BCs& _m,J <0@~hA6U?GS$}7MO[_?~=? 
var/home/core/zuul-output/logs/kubelet.log
Feb 28 03:34:20 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 28 03:34:20 crc restorecon[4687]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]:
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 
03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 
crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 
03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc 
restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c14 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 28 03:34:20 crc 
restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 28 03:34:20 crc restorecon[4687]:
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:20 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 
crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 28 03:34:21 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:21 crc restorecon[4687]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 28 03:34:21 crc restorecon[4687]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Feb 28 03:34:22 crc kubenswrapper[4819]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 28 03:34:22 crc kubenswrapper[4819]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Feb 28 03:34:22 crc kubenswrapper[4819]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 28 03:34:22 crc kubenswrapper[4819]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 28 03:34:22 crc kubenswrapper[4819]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 28 03:34:22 crc kubenswrapper[4819]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.068511 4819 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.073948 4819 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.073986 4819 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.073998 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074008 4819 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074019 4819 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074032 4819 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074059 4819 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074069 4819 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074077 4819 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074087 4819 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074097 4819 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074108 4819 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074118 4819 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074126 4819 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074135 4819 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074145 4819 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074153 4819 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074161 4819 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074170 4819 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074178 4819 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074187 4819 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074196 4819 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074204 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074213 4819 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074221 4819 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074229 4819 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074238 4819 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074275 4819 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074286 4819 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074300 4819 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074309 4819 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074319 4819 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074328 4819 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074341 4819 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074350 4819 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074359 4819 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074368 4819 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074378 4819 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074386 4819 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074394 4819 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074402 4819 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074410 4819 feature_gate.go:330] unrecognized feature gate: Example
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074418 4819 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074426 4819 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074434 4819 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074442 4819 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074449 4819 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074457 4819 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074465 4819 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074472 4819 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074480 4819 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074488 4819 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074496 4819 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074506 4819 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074514 4819 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074522 4819 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074533 4819 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074542 4819 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074550 4819 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074559 4819 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074566 4819 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074574 4819 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074582 4819 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074589 4819 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074598 4819 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074605 4819 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074613 4819 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074624 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074634 4819 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074642 4819 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.074650 4819 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075666 4819 flags.go:64] FLAG: --address="0.0.0.0"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075691 4819 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075706 4819 flags.go:64] FLAG: --anonymous-auth="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075718 4819 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075730 4819 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075740 4819 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075752 4819 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075764 4819 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075773 4819 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075782 4819 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075792 4819 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075801 4819 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075811 4819 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075820 4819 flags.go:64] FLAG: --cgroup-root=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075829 4819 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075838 4819 flags.go:64] FLAG: --client-ca-file=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075847 4819 flags.go:64] FLAG: --cloud-config=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075855 4819 flags.go:64] FLAG: --cloud-provider=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075864 4819 flags.go:64] FLAG: --cluster-dns="[]"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075877 4819 flags.go:64] FLAG: --cluster-domain=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075886 4819 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075895 4819 flags.go:64] FLAG: --config-dir=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075904 4819 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075915 4819 flags.go:64] FLAG: --container-log-max-files="5"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075927 4819 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075937 4819 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075947 4819 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075958 4819 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075969 4819 flags.go:64] FLAG: --contention-profiling="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075979 4819 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075988 4819 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.075997 4819 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076006 4819 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076018 4819 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076027 4819 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076036 4819 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076045 4819 flags.go:64] FLAG: --enable-load-reader="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076054 4819 flags.go:64] FLAG: --enable-server="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076064 4819 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076074 4819 flags.go:64] FLAG: --event-burst="100"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076083 4819 flags.go:64] FLAG: --event-qps="50"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076093 4819 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076102 4819 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076111 4819 flags.go:64] FLAG: --eviction-hard=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076122 4819 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076131 4819 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076140 4819 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076149 4819 flags.go:64] FLAG: --eviction-soft=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076158 4819 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076167 4819 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076176 4819 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076185 4819 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076193 4819 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076203 4819 flags.go:64] FLAG: --fail-swap-on="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076212 4819 flags.go:64] FLAG: --feature-gates=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076223 4819 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076232 4819 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076241 4819 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076278 4819 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076289 4819 flags.go:64] FLAG: --healthz-port="10248"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076299 4819 flags.go:64] FLAG: --help="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076308 4819 flags.go:64] FLAG: --hostname-override=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076316 4819 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076325 4819 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076335 4819 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076345 4819 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076353 4819 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076362 4819 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076371 4819 flags.go:64] FLAG: --image-service-endpoint=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076379 4819 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076389 4819 flags.go:64] FLAG: --kube-api-burst="100"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076397 4819 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076407 4819 flags.go:64] FLAG: --kube-api-qps="50"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076415 4819 flags.go:64] FLAG: --kube-reserved=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076424 4819 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076434 4819 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076443 4819 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076451 4819 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076460 4819 flags.go:64] FLAG: --lock-file=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076469 4819 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076478 4819 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076487 4819 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076501 4819 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076509 4819 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076518 4819 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076527 4819 flags.go:64] FLAG: --logging-format="text"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076536 4819 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076546 4819 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076556 4819 flags.go:64] FLAG: --manifest-url=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076565 4819 flags.go:64] FLAG: --manifest-url-header=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076577 4819 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076587 4819 flags.go:64] FLAG: --max-open-files="1000000"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076599 4819 flags.go:64] FLAG: --max-pods="110"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076608 4819 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076617 4819 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076626 4819 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076635 4819 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076645 4819 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076654 4819 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076663 4819 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076683 4819 flags.go:64] FLAG: --node-status-max-images="50"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076692 4819 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076702 4819 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076711 4819 flags.go:64] FLAG: --pod-cidr=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076720 4819 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076732 4819 flags.go:64] FLAG: --pod-manifest-path=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076741 4819 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076751 4819 flags.go:64] FLAG: --pods-per-core="0"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076760 4819 flags.go:64] FLAG: --port="10250"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076769 4819 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076777 4819 flags.go:64] FLAG: --provider-id=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076787 4819 flags.go:64] FLAG: --qos-reserved=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076795 4819 flags.go:64] FLAG: --read-only-port="10255"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076805 4819 flags.go:64] FLAG: --register-node="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076813 4819 flags.go:64] FLAG: --register-schedulable="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076822 4819 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076837 4819 flags.go:64] FLAG: --registry-burst="10"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076846 4819 flags.go:64] FLAG: --registry-qps="5"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076855 4819 flags.go:64] FLAG: --reserved-cpus=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076863 4819 flags.go:64] FLAG: --reserved-memory=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076874 4819 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076884 4819 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076893 4819 flags.go:64] FLAG: --rotate-certificates="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076909 4819 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076918 4819 flags.go:64] FLAG: --runonce="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076929 4819 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076939 4819 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076948 4819 flags.go:64] FLAG: --seccomp-default="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076957 4819 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076966 4819 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076975 4819 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076984 4819 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.076994 4819 flags.go:64] FLAG: --storage-driver-password="root"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077003 4819 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077012 4819 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077021 4819 flags.go:64] FLAG: --storage-driver-user="root"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077030 4819 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077041 4819 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077050 4819 flags.go:64] FLAG: --system-cgroups=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077058 4819 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077072 4819 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 28
03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077081 4819 flags.go:64] FLAG: --tls-cert-file="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077090 4819 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077101 4819 flags.go:64] FLAG: --tls-min-version="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077110 4819 flags.go:64] FLAG: --tls-private-key-file="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077119 4819 flags.go:64] FLAG: --topology-manager-policy="none" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077128 4819 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077137 4819 flags.go:64] FLAG: --topology-manager-scope="container" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077146 4819 flags.go:64] FLAG: --v="2" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077158 4819 flags.go:64] FLAG: --version="false" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077169 4819 flags.go:64] FLAG: --vmodule="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077180 4819 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.077189 4819 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081057 4819 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081127 4819 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081142 4819 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081155 4819 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081167 4819 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081179 4819 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081192 4819 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081201 4819 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081210 4819 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081219 4819 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081228 4819 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081239 4819 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081283 4819 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081293 4819 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081301 4819 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081310 4819 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081319 4819 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081329 4819 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081340 4819 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 28 
03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081352 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081361 4819 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081369 4819 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081378 4819 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081387 4819 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081395 4819 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081404 4819 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081414 4819 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081437 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081452 4819 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081465 4819 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081480 4819 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081493 4819 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081504 4819 feature_gate.go:330] unrecognized feature gate: Example Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081513 4819 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081522 4819 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081531 4819 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081539 4819 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081548 4819 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081558 4819 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081570 4819 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081580 4819 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081588 4819 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081597 4819 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081606 4819 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081613 4819 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081621 4819 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081632 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081640 4819 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081649 4819 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081657 4819 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081665 4819 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081673 4819 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081681 4819 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081689 4819 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081697 4819 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081709 4819 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081719 4819 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081727 4819 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081737 4819 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081745 4819 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081753 4819 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081764 4819 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081777 4819 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081786 4819 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081795 4819 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081804 4819 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081812 4819 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081821 4819 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081830 4819 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.081839 4819 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 28 03:34:22 crc kubenswrapper[4819]: 
W0228 03:34:22.081846 4819 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.081865 4819 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.096149 4819 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.096200 4819 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096378 4819 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096393 4819 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096402 4819 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096411 4819 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096420 4819 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096428 4819 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096437 4819 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096445 4819 feature_gate.go:330] 
unrecognized feature gate: MachineAPIMigration Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096452 4819 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096461 4819 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096468 4819 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096477 4819 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096485 4819 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096493 4819 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096501 4819 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096512 4819 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096522 4819 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096530 4819 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096538 4819 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096546 4819 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096555 4819 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096563 4819 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096572 4819 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096579 4819 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096587 4819 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096596 4819 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096604 4819 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096612 4819 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096621 4819 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096631 4819 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 28 03:34:22 crc kubenswrapper[4819]: 
W0228 03:34:22.096642 4819 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096654 4819 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096666 4819 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096675 4819 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096687 4819 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096695 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096704 4819 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096712 4819 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096721 4819 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096729 4819 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096738 4819 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096747 4819 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096755 4819 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096765 4819 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096778 4819 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096787 4819 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096796 4819 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096805 4819 feature_gate.go:330] unrecognized feature gate: Example Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096813 4819 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096822 4819 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096830 4819 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096838 4819 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096849 4819 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096857 4819 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096866 4819 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096874 4819 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096881 4819 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096889 4819 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096897 4819 feature_gate.go:330] unrecognized feature gate: 
OVNObservability Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096905 4819 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096913 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096921 4819 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096929 4819 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096937 4819 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096945 4819 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096952 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096960 4819 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096969 4819 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096977 4819 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096987 4819 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.096998 4819 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.097011 4819 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097286 4819 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097303 4819 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097313 4819 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097324 4819 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097333 4819 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097343 4819 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097352 4819 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097363 4819 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097375 4819 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097383 4819 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097392 4819 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097401 4819 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097409 4819 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097419 4819 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097428 4819 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097437 4819 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097447 4819 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097457 4819 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097466 4819 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097477 4819 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097487 4819 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097496 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097506 4819 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097515 4819 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097524 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097533 4819 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097542 4819 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097551 4819 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097559 4819 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097567 4819 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097575 4819 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097583 4819 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097591 4819 feature_gate.go:330] unrecognized feature gate: Example
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097600 4819 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097619 4819 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097627 4819 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097637 4819 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097648 4819 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097657 4819 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097666 4819 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097675 4819 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097683 4819 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097691 4819 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097700 4819 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097708 4819 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097719 4819 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097729 4819 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097739 4819 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097751 4819 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097762 4819 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097771 4819 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097781 4819 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097789 4819 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097797 4819 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097806 4819 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097814 4819 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097822 4819 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097831 4819 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097839 4819 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097847 4819 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097857 4819 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097867 4819 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097878 4819 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097889 4819 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097899 4819 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097909 4819 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097920 4819 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097929 4819 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097937 4819 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097946 4819 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.097955 4819 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.097969 4819 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.098294 4819 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.104382 4819 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.112557 4819 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.112806 4819 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.114756 4819 server.go:997] "Starting client certificate rotation"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.114803 4819 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.115105 4819 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.149583 4819 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.150460 4819 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.154411 4819 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.173592 4819 log.go:25] "Validated CRI v1 runtime API"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.224487 4819 log.go:25] "Validated CRI v1 image API"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.226997 4819 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.236066 4819 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-28-03-30-51-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.236107 4819 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.261018 4819 manager.go:217] Machine: {Timestamp:2026-02-28 03:34:22.257697576 +0000 UTC m=+0.723266474 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:93f3ff1a-d0a3-46b4-b86c-112127fcdcca BootID:81735f3f-b725-4ecc-bf66-34a29471cd39 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:eb:1d:fb Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:eb:1d:fb Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:2a:c9:3b Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:36:70:01 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:fa:08:7f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:d4:75:05 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4a:f6:35:69:24:07 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:82:6d:1d:d5:83:93 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.261799 4819 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.262100 4819 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.262581 4819 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.262854 4819 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.262950 4819 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.263375 4819 topology_manager.go:138] "Creating topology manager with none policy"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.263445 4819 container_manager_linux.go:303] "Creating device plugin manager"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.264178 4819 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.264322 4819 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.265382 4819 state_mem.go:36] "Initialized new in-memory state store"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.265616 4819 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.271018 4819 kubelet.go:418] "Attempting to sync node with API server"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.271270 4819 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.271404 4819 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.271489 4819 kubelet.go:324] "Adding apiserver pod source"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.271666 4819 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.276536 4819 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.277782 4819 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.278447 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.278564 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.278874 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.279047 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.279461 4819 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282319 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282367 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282385 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282400 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282424 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282439 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282456 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282482 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282503 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282517 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282537 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.282551 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.283925 4819 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.284752 4819 server.go:1280] "Started kubelet"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.284747 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.285985 4819 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.286025 4819 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.287308 4819 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 28 03:34:22 crc systemd[1]: Started Kubernetes Kubelet.
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.288447 4819 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.288496 4819 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.288754 4819 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.288841 4819 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.289001 4819 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.288998 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.290139 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.290322 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.290895 4819 factory.go:55] Registering systemd factory
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.290935 4819 factory.go:221] Registration of the systemd container factory successfully
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.290925 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="200ms"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.291754 4819 factory.go:153] Registering CRI-O factory
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.291808 4819 factory.go:221] Registration of the crio container factory successfully
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.291938 4819 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.291985 4819 factory.go:103] Registering Raw factory
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.292020 4819 manager.go:1196] Started watching for new ooms in manager
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.297376 4819 manager.go:319] Starting recovery of all containers
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.303957 4819 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18984bad95c91055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC m=+0.750266583,LastTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC m=+0.750266583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.310061 4819 server.go:460] "Adding debug handlers to kubelet server"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316405 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316500 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316519 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316540 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316558 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316575 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316604 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316637 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316657 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316685 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316703 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316719 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316736 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316753 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316767 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316783 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316804 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316824 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316842 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316858 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316875 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316891 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316911 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316926 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316946 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316962 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316981 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.316997 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317050 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317069 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317092 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317108 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317134 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317162 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317179 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert"
seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317225 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317240 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317284 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317299 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317320 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317341 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317357 4819 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317373 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317395 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317415 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317431 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317448 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317464 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317484 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317500 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317516 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317534 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317557 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317579 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317597 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317614 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317635 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317774 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317804 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317833 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317853 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317882 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317900 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317919 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317941 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317961 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.317982 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318027 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318046 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318072 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318099 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318122 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318143 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318161 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318181 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318204 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318230 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318272 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318291 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318310 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318334 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318353 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318372 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318391 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318410 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318429 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318448 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318471 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318492 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318512 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318534 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318554 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318575 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318593 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318615 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318634 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318657 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318676 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318695 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318713 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318735 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318754 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318772 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318797 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318826 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318846 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318865 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318888 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318907 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318925 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318946 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318968 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.318988 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319008 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319026 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319043 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319064 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319081 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319098 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319114 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319129 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319145 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319162 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319178 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319197 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319212 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319226 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319240 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319283 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319299 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319314 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319333 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319347 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319365 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319383 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319405 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319421 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319439 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319460 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319477 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319493 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319509 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319526 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319563 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319585 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319604 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319621 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319641 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319660 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319678 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319695 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319712 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319735 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319755 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319774 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319794 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319820 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319837 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319853 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319870 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319893 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319909 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319925 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319944 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319963 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319979 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.319995 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320017 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320033 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320053 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320083 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320099 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320115 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320131 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320148 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320166 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320187 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320234 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320308 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320332 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320356 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320374 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320389 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320404 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320418 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320431 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320446 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320460 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320472 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320485 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320497 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320511 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320526 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320540 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320552 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320567 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320579 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320591 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320608 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320625 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320639 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320651 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320669 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.320686 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.323196 4819 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.323240 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.323287 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.323306 4819 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.323324 4819 reconstruct.go:97] "Volume reconstruction finished"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.323336 4819 reconciler.go:26] "Reconciler: start to sync state"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.334416 4819 manager.go:324] Recovery completed
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.357690 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.359309 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.359370 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.359392 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.363376 4819 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.365007 4819 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.365039 4819 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.365063 4819 state_mem.go:36] "Initialized new in-memory state store"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.367499 4819 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.367594 4819 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.367639 4819 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.367717 4819 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 28 03:34:22 crc kubenswrapper[4819]: W0228 03:34:22.368420 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.368490 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.389957 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.414024 4819 policy_none.go:49] "None policy: Start"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.415646 4819 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.415683 4819 state_mem.go:35] "Initializing new in-memory state store"
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.468122 4819 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.470064 4819 manager.go:334] "Starting Device Plugin manager"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.470199 4819 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.470219 4819 server.go:79] "Starting device plugin registration server"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.470813 4819 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.470855 4819 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.471100 4819 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.471179 4819 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.471188 4819 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.485653 4819 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.492728 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="400ms"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.571733 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.573626 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.573687 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.573705 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.573752 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.574499 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.669316 4819 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.669496 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.671214 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.671301 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.671341 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.671562 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.672126 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.672224 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.673753 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.673806 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.673853 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.673871 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.673825 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.673940 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.674194 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.674485 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.674567 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.675492 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.675529 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.675550 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.675826 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.675903 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.675938 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.675956 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.676909 4819 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.677016 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.678552 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.678586 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.678603 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.678665 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.678702 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.678727 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.678799 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.679108 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.679213 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.680672 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.680719 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.680738 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.680765 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.680793 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.680809 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.680991 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.681033 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.682216 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.682295 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.682318 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.728186 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.728309 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.728363 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.728461 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.728597 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.728651 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.728695 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.728801 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.728902 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.729009 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.729121 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.729209 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.729325 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.729419 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.729518 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.774903 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.776911 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.776963 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.776985 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.777079 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.777953 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.830799 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 
03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.830861 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.830900 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.830931 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.830965 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.830996 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.830996 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831025 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831048 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831059 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831049 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831113 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831005 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831170 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831097 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831194 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831188 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831027 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831288 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831337 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831371 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831402 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831434 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831463 4819 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831527 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831535 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831572 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831504 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831583 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: I0228 03:34:22.831546 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:22 crc kubenswrapper[4819]: E0228 03:34:22.896483 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="800ms" Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.029288 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.043875 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.067929 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.081929 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.089815 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 28 03:34:23 crc kubenswrapper[4819]: W0228 03:34:23.090821 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-83ab717155cd2b0313e7c48b45e118b0989e11d7e3337ae4e47e177fcfa03335 WatchSource:0}: Error finding container 83ab717155cd2b0313e7c48b45e118b0989e11d7e3337ae4e47e177fcfa03335: Status 404 returned error can't find the container with id 83ab717155cd2b0313e7c48b45e118b0989e11d7e3337ae4e47e177fcfa03335 Feb 28 03:34:23 crc kubenswrapper[4819]: W0228 03:34:23.092430 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e4fe071c3e1881e8697d265a9ea18882c0005da2ff56102a7be65842d5db70cc WatchSource:0}: Error finding container e4fe071c3e1881e8697d265a9ea18882c0005da2ff56102a7be65842d5db70cc: Status 404 returned error can't find the container with id e4fe071c3e1881e8697d265a9ea18882c0005da2ff56102a7be65842d5db70cc Feb 28 03:34:23 crc kubenswrapper[4819]: W0228 03:34:23.099806 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b21a490d519e60c91f252f2fe41a2b6d4e3dd8ca4ab324bd93ca64fd43d731b3 WatchSource:0}: Error finding container b21a490d519e60c91f252f2fe41a2b6d4e3dd8ca4ab324bd93ca64fd43d731b3: Status 404 returned error can't find the container with id b21a490d519e60c91f252f2fe41a2b6d4e3dd8ca4ab324bd93ca64fd43d731b3 Feb 28 03:34:23 crc kubenswrapper[4819]: W0228 03:34:23.107717 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a07487454c919728da16361954f37fd8dc08ed84e6f8dabf08f0470f8067b83f 
WatchSource:0}: Error finding container a07487454c919728da16361954f37fd8dc08ed84e6f8dabf08f0470f8067b83f: Status 404 returned error can't find the container with id a07487454c919728da16361954f37fd8dc08ed84e6f8dabf08f0470f8067b83f
Feb 28 03:34:23 crc kubenswrapper[4819]: W0228 03:34:23.133230 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e9db3ef09846b3ebcefde009eee2954f3e004854df34f0b02d753afb7057b179 WatchSource:0}: Error finding container e9db3ef09846b3ebcefde009eee2954f3e004854df34f0b02d753afb7057b179: Status 404 returned error can't find the container with id e9db3ef09846b3ebcefde009eee2954f3e004854df34f0b02d753afb7057b179
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.178632 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.180466 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.180525 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.180545 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.180592 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 03:34:23 crc kubenswrapper[4819]: E0228 03:34:23.181531 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc"
Feb 28 03:34:23 crc kubenswrapper[4819]: W0228 03:34:23.270076 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:23 crc kubenswrapper[4819]: E0228 03:34:23.270285 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.286831 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:23 crc kubenswrapper[4819]: W0228 03:34:23.362817 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:23 crc kubenswrapper[4819]: E0228 03:34:23.362938 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.375159 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e9db3ef09846b3ebcefde009eee2954f3e004854df34f0b02d753afb7057b179"}
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.377154 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a07487454c919728da16361954f37fd8dc08ed84e6f8dabf08f0470f8067b83f"}
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.378417 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b21a490d519e60c91f252f2fe41a2b6d4e3dd8ca4ab324bd93ca64fd43d731b3"}
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.380428 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e4fe071c3e1881e8697d265a9ea18882c0005da2ff56102a7be65842d5db70cc"}
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.381747 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"83ab717155cd2b0313e7c48b45e118b0989e11d7e3337ae4e47e177fcfa03335"}
Feb 28 03:34:23 crc kubenswrapper[4819]: W0228 03:34:23.477863 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:23 crc kubenswrapper[4819]: E0228 03:34:23.477943 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:23 crc kubenswrapper[4819]: E0228 03:34:23.697032 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="1.6s"
Feb 28 03:34:23 crc kubenswrapper[4819]: W0228 03:34:23.782618 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:23 crc kubenswrapper[4819]: E0228 03:34:23.783135 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.981906 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.985086 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.985155 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.985187 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:23 crc kubenswrapper[4819]: I0228 03:34:23.985235 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 03:34:23 crc kubenswrapper[4819]: E0228 03:34:23.986006 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.287180 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.341709 4819 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 28 03:34:24 crc kubenswrapper[4819]: E0228 03:34:24.343189 4819 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.389093 4819 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb" exitCode=0
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.389297 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.389347 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb"}
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.391075 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.391126 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.391146 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.393755 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4e774a8d641740281d566829c580c19c15c1309f5c24333701556e3cd984480b"}
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.393880 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"67db3bae0b60db8f41b5448a1b29d377d320b668f3aaebfcaec593d99c8849e5"}
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.396799 4819 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355" exitCode=0
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.396922 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355"}
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.397075 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.399650 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.399695 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.399716 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.400876 4819 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188d" exitCode=0
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.400932 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188d"}
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.401105 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.402483 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.405922 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.406280 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.406327 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.408823 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.408908 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.408942 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.409634 4819 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6" exitCode=0
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.409712 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6"}
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.410161 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.413794 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.413830 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:24 crc kubenswrapper[4819]: I0228 03:34:24.413844 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:25 crc kubenswrapper[4819]: W0228 03:34:25.142001 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:25 crc kubenswrapper[4819]: E0228 03:34:25.142106 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.286079 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:25 crc kubenswrapper[4819]: E0228 03:34:25.298230 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="3.2s"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.418700 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4edad6d5aefa2b4525c99955b0f46074375ef146eac74482374c6563e137d2a2"}
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.418748 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.418782 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b4333ea161996cd65ab3027eafa25efe039a1eaf4eae370bd93c5781ae44c00f"}
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.419678 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.419715 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.419727 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.422321 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553"}
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.422385 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad"}
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.422402 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c"}
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.424540 4819 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec" exitCode=0
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.424634 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.424635 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec"}
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.425446 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.425492 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.425503 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.428778 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.428767 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"079bd53ac5364aea9c07f32f8321259470df894518e6c95a0d5711fe6f36ce32"}
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.430562 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.430598 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.430611 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.438045 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5bfa00ea11af9a3f84da769b8eee5b2df34b57e08d669ec681f66632c2fb8db0"}
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.438098 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fd18a54e72f50ca30875e73941b4b4a748949baf3e2e39fd74bac94a32054a70"}
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.438115 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4e79c09cb2dd74f17c57f56a55f863984633737cdae53673b22d0f70fb7d1495"}
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.438219 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.439634 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.439691 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.439706 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:25 crc kubenswrapper[4819]: W0228 03:34:25.466727 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Feb 28 03:34:25 crc kubenswrapper[4819]: E0228 03:34:25.467587 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.587003 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.588820 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.588872 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.588928 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:25 crc kubenswrapper[4819]: I0228 03:34:25.588964 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 03:34:25 crc kubenswrapper[4819]: E0228 03:34:25.589769 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.270123 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.278048 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.445032 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9"}
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.445120 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6f231baa96187f2712b4797caf8999d18b43fc6b4f9d1943f75315c8a50cb74a"}
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.445201 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.446485 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.446542 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.446564 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.448578 4819 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05" exitCode=0
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.448627 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05"}
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.448738 4819 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.448779 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.448797 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.448806 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.449182 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.450499 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.450552 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.450574 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.450764 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.450818 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.450836 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.451182 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.451344 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.451375 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.451719 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.451761 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.451781 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:26 crc kubenswrapper[4819]: I0228 03:34:26.683474 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.457658 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f31b1e80c49f686cd4ba79a59efefe67ccb8b7ca7055a19bcf7338cfc97804bc"}
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.457736 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"931a5d438456f1c0cf96c2c26f70a55d60eb4557831639d66b47f068bdb2be31"}
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.457752 4819 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.457831 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.458058 4819 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.458137 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.459655 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.459725 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.459748 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.460580 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.460623 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.460637 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:27 crc kubenswrapper[4819]: I0228 03:34:27.894611 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.387484 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.470662 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8cf17e6bdcd3888d2a1bf313429c7fa99831233a4f38b71729844ec37dcb27c5"}
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.471656 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6f57615eaa4744ed0df55d7c57709e96489e705b02cb6f0984eddf5999ffc730"}
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.470905 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.470792 4819 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.471844 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.473940 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.474007 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.474033 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.474742 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.474806 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.474831 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.590008 4819 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.719939 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.789886 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.792157 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.792222 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.792243 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:28 crc kubenswrapper[4819]: I0228 03:34:28.792322 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.050288 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.482316 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"aa573d495e9a6c91589189b25a721deb399cc0cc50bd6581fa578607a7c93b3e"}
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.482458 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.482532 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.482568 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.484630 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.484685 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.484629 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.484739 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.484706 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.484767 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.485706 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.485757 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.485777 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.684585 4819 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.684771 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 28 03:34:29 crc kubenswrapper[4819]: I0228 03:34:29.899577 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.324325 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.324594 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.326232 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.326367 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.326392 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.487518 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.487624 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.489803 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.489873 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.489991 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.492975 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.493040 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Feb 28 03:34:30 crc kubenswrapper[4819]: I0228 03:34:30.493062 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:31 crc kubenswrapper[4819]: I0228 03:34:31.490838 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:31 crc kubenswrapper[4819]: I0228 03:34:31.492437 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:31 crc kubenswrapper[4819]: I0228 03:34:31.492517 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:31 crc kubenswrapper[4819]: I0228 03:34:31.492537 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:32 crc kubenswrapper[4819]: E0228 03:34:32.486434 4819 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 03:34:32 crc kubenswrapper[4819]: I0228 03:34:32.603602 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:32 crc kubenswrapper[4819]: I0228 03:34:32.603837 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:32 crc kubenswrapper[4819]: I0228 03:34:32.605347 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:32 crc kubenswrapper[4819]: I0228 03:34:32.605399 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:32 crc kubenswrapper[4819]: I0228 03:34:32.605418 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:33 crc kubenswrapper[4819]: 
I0228 03:34:33.373436 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 28 03:34:33 crc kubenswrapper[4819]: I0228 03:34:33.373712 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:33 crc kubenswrapper[4819]: I0228 03:34:33.376323 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:33 crc kubenswrapper[4819]: I0228 03:34:33.376399 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:33 crc kubenswrapper[4819]: I0228 03:34:33.376415 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:36 crc kubenswrapper[4819]: W0228 03:34:36.128114 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 28 03:34:36 crc kubenswrapper[4819]: I0228 03:34:36.128193 4819 trace.go:236] Trace[961040691]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Feb-2026 03:34:26.126) (total time: 10001ms): Feb 28 03:34:36 crc kubenswrapper[4819]: Trace[961040691]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (03:34:36.128) Feb 28 03:34:36 crc kubenswrapper[4819]: Trace[961040691]: [10.001474542s] [10.001474542s] END Feb 28 03:34:36 crc kubenswrapper[4819]: E0228 03:34:36.128214 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake 
timeout" logger="UnhandledError" Feb 28 03:34:36 crc kubenswrapper[4819]: I0228 03:34:36.286534 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 28 03:34:36 crc kubenswrapper[4819]: W0228 03:34:36.889602 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 28 03:34:36 crc kubenswrapper[4819]: I0228 03:34:36.890095 4819 trace.go:236] Trace[402625007]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Feb-2026 03:34:26.888) (total time: 10001ms): Feb 28 03:34:36 crc kubenswrapper[4819]: Trace[402625007]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (03:34:36.889) Feb 28 03:34:36 crc kubenswrapper[4819]: Trace[402625007]: [10.00167373s] [10.00167373s] END Feb 28 03:34:36 crc kubenswrapper[4819]: E0228 03:34:36.890139 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 28 03:34:37 crc kubenswrapper[4819]: E0228 03:34:37.506836 4819 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-02-28T03:34:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:34:37 crc kubenswrapper[4819]: E0228 03:34:37.510406 4819 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:37Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18984bad95c91055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC m=+0.750266583,LastTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC m=+0.750266583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:34:37 crc kubenswrapper[4819]: E0228 03:34:37.514950 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:37Z is after 2026-02-23T05:33:13Z" interval="6.4s" Feb 28 03:34:37 crc kubenswrapper[4819]: I0228 03:34:37.517072 4819 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 28 03:34:37 crc kubenswrapper[4819]: I0228 
03:34:37.517135 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 28 03:34:37 crc kubenswrapper[4819]: E0228 03:34:37.517168 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:37Z is after 2026-02-23T05:33:13Z" node="crc" Feb 28 03:34:37 crc kubenswrapper[4819]: I0228 03:34:37.521379 4819 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 28 03:34:37 crc kubenswrapper[4819]: I0228 03:34:37.521420 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 28 03:34:37 crc kubenswrapper[4819]: I0228 03:34:37.521695 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:37Z is after 2026-02-23T05:33:13Z Feb 28 03:34:37 crc kubenswrapper[4819]: W0228 03:34:37.525088 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:37Z is after 2026-02-23T05:33:13Z Feb 28 03:34:37 crc kubenswrapper[4819]: E0228 03:34:37.525177 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:34:37 crc kubenswrapper[4819]: W0228 03:34:37.537426 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:37Z is after 2026-02-23T05:33:13Z Feb 28 03:34:37 crc kubenswrapper[4819]: E0228 03:34:37.537520 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:34:37 crc kubenswrapper[4819]: I0228 03:34:37.816389 4819 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 
192.168.126.11:17697: connect: connection refused" start-of-body= Feb 28 03:34:37 crc kubenswrapper[4819]: I0228 03:34:37.816477 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 28 03:34:37 crc kubenswrapper[4819]: I0228 03:34:37.904286 4819 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]log ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]etcd ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/generic-apiserver-start-informers ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/priority-and-fairness-filter ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/start-apiextensions-informers ok Feb 28 03:34:37 crc kubenswrapper[4819]: 
[+]poststarthook/start-apiextensions-controllers ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/crd-informer-synced ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/start-system-namespaces-controller ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 28 03:34:37 crc kubenswrapper[4819]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 28 03:34:37 crc kubenswrapper[4819]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/bootstrap-controller ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/start-kube-aggregator-informers ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/apiservice-registration-controller ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/apiservice-discovery-controller ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 28 03:34:37 crc kubenswrapper[4819]: 
[+]autoregister-completion ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/apiservice-openapi-controller ok Feb 28 03:34:37 crc kubenswrapper[4819]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 28 03:34:37 crc kubenswrapper[4819]: livez check failed Feb 28 03:34:37 crc kubenswrapper[4819]: I0228 03:34:37.904380 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.291301 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:38Z is after 2026-02-23T05:33:13Z Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.394219 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.394442 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.396155 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.396214 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.396233 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.515216 4819 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.517771 4819 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6f231baa96187f2712b4797caf8999d18b43fc6b4f9d1943f75315c8a50cb74a" exitCode=255 Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.517821 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6f231baa96187f2712b4797caf8999d18b43fc6b4f9d1943f75315c8a50cb74a"} Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.518020 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.519196 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.519276 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.519299 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:38 crc kubenswrapper[4819]: I0228 03:34:38.520203 4819 scope.go:117] "RemoveContainer" containerID="6f231baa96187f2712b4797caf8999d18b43fc6b4f9d1943f75315c8a50cb74a" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.290095 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:39Z is after 2026-02-23T05:33:13Z Feb 28 03:34:39 crc kubenswrapper[4819]: 
I0228 03:34:39.524386 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.527621 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"17705c617ee0ba8a217ff0d3c82c8dcd71d88ebe4460028009d1634979697029"} Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.527838 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.529039 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.529099 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.529117 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.684420 4819 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.684527 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.938703 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.938935 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.940792 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.941011 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.941180 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:39 crc kubenswrapper[4819]: I0228 03:34:39.961030 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 28 03:34:40 crc kubenswrapper[4819]: W0228 03:34:40.224133 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:40Z is after 2026-02-23T05:33:13Z Feb 28 03:34:40 crc kubenswrapper[4819]: E0228 03:34:40.224282 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.290502 4819 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:40Z is after 2026-02-23T05:33:13Z Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.533602 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.534345 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.536308 4819 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="17705c617ee0ba8a217ff0d3c82c8dcd71d88ebe4460028009d1634979697029" exitCode=255 Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.536495 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.536732 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"17705c617ee0ba8a217ff0d3c82c8dcd71d88ebe4460028009d1634979697029"} Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.536971 4819 scope.go:117] "RemoveContainer" containerID="6f231baa96187f2712b4797caf8999d18b43fc6b4f9d1943f75315c8a50cb74a" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.537175 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.537709 4819 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.537757 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.537778 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.538416 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.538456 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.538476 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:40 crc kubenswrapper[4819]: I0228 03:34:40.539342 4819 scope.go:117] "RemoveContainer" containerID="17705c617ee0ba8a217ff0d3c82c8dcd71d88ebe4460028009d1634979697029" Feb 28 03:34:40 crc kubenswrapper[4819]: E0228 03:34:40.539712 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:34:41 crc kubenswrapper[4819]: I0228 03:34:41.290157 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:41Z is after 2026-02-23T05:33:13Z Feb 28 03:34:41 crc 
kubenswrapper[4819]: I0228 03:34:41.540497 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 28 03:34:42 crc kubenswrapper[4819]: I0228 03:34:42.290421 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:42Z is after 2026-02-23T05:33:13Z Feb 28 03:34:42 crc kubenswrapper[4819]: W0228 03:34:42.316579 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:42Z is after 2026-02-23T05:33:13Z Feb 28 03:34:42 crc kubenswrapper[4819]: E0228 03:34:42.316731 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:34:42 crc kubenswrapper[4819]: E0228 03:34:42.487457 4819 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 03:34:42 crc kubenswrapper[4819]: I0228 03:34:42.903496 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:42 crc kubenswrapper[4819]: I0228 03:34:42.903768 4819 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:42 crc kubenswrapper[4819]: I0228 03:34:42.905411 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:42 crc kubenswrapper[4819]: I0228 03:34:42.905475 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:42 crc kubenswrapper[4819]: I0228 03:34:42.905499 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:42 crc kubenswrapper[4819]: I0228 03:34:42.906356 4819 scope.go:117] "RemoveContainer" containerID="17705c617ee0ba8a217ff0d3c82c8dcd71d88ebe4460028009d1634979697029" Feb 28 03:34:42 crc kubenswrapper[4819]: E0228 03:34:42.906687 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:34:42 crc kubenswrapper[4819]: I0228 03:34:42.910863 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:43 crc kubenswrapper[4819]: I0228 03:34:43.288471 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:43Z is after 2026-02-23T05:33:13Z Feb 28 03:34:43 crc kubenswrapper[4819]: I0228 03:34:43.549425 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 28 03:34:43 crc kubenswrapper[4819]: I0228 03:34:43.550752 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:43 crc kubenswrapper[4819]: I0228 03:34:43.550814 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:43 crc kubenswrapper[4819]: I0228 03:34:43.550838 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:43 crc kubenswrapper[4819]: I0228 03:34:43.551905 4819 scope.go:117] "RemoveContainer" containerID="17705c617ee0ba8a217ff0d3c82c8dcd71d88ebe4460028009d1634979697029" Feb 28 03:34:43 crc kubenswrapper[4819]: E0228 03:34:43.552312 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:34:43 crc kubenswrapper[4819]: I0228 03:34:43.918315 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:43 crc kubenswrapper[4819]: I0228 03:34:43.919798 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:43 crc kubenswrapper[4819]: I0228 03:34:43.919857 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:43 crc kubenswrapper[4819]: I0228 03:34:43.919878 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:43 crc kubenswrapper[4819]: I0228 03:34:43.919925 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 
28 03:34:43 crc kubenswrapper[4819]: E0228 03:34:43.921064 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:43Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 28 03:34:43 crc kubenswrapper[4819]: E0228 03:34:43.924808 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:43Z is after 2026-02-23T05:33:13Z" node="crc" Feb 28 03:34:44 crc kubenswrapper[4819]: I0228 03:34:44.290274 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:44Z is after 2026-02-23T05:33:13Z Feb 28 03:34:44 crc kubenswrapper[4819]: W0228 03:34:44.667716 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:44Z is after 2026-02-23T05:33:13Z Feb 28 03:34:44 crc kubenswrapper[4819]: E0228 03:34:44.667823 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-28T03:34:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:34:45 crc kubenswrapper[4819]: I0228 03:34:45.291109 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:45Z is after 2026-02-23T05:33:13Z Feb 28 03:34:45 crc kubenswrapper[4819]: I0228 03:34:45.858087 4819 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 28 03:34:45 crc kubenswrapper[4819]: E0228 03:34:45.863532 4819 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:34:46 crc kubenswrapper[4819]: I0228 03:34:46.290891 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:46Z is after 2026-02-23T05:33:13Z Feb 28 03:34:47 crc kubenswrapper[4819]: I0228 03:34:47.291484 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:47Z is after 2026-02-23T05:33:13Z Feb 28 
03:34:47 crc kubenswrapper[4819]: E0228 03:34:47.515541 4819 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:47Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18984bad95c91055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC m=+0.750266583,LastTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC m=+0.750266583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:34:47 crc kubenswrapper[4819]: I0228 03:34:47.815749 4819 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:47 crc kubenswrapper[4819]: I0228 03:34:47.816059 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:47 crc kubenswrapper[4819]: I0228 03:34:47.817390 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:47 crc kubenswrapper[4819]: I0228 03:34:47.817446 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:47 crc kubenswrapper[4819]: I0228 03:34:47.817464 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:47 crc kubenswrapper[4819]: I0228 03:34:47.818374 4819 scope.go:117] "RemoveContainer" 
containerID="17705c617ee0ba8a217ff0d3c82c8dcd71d88ebe4460028009d1634979697029" Feb 28 03:34:47 crc kubenswrapper[4819]: E0228 03:34:47.818728 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:34:48 crc kubenswrapper[4819]: I0228 03:34:48.289065 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:48Z is after 2026-02-23T05:33:13Z Feb 28 03:34:48 crc kubenswrapper[4819]: W0228 03:34:48.320575 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:48Z is after 2026-02-23T05:33:13Z Feb 28 03:34:48 crc kubenswrapper[4819]: E0228 03:34:48.320671 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.050355 4819 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.050537 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.051888 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.051958 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.051978 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.052838 4819 scope.go:117] "RemoveContainer" containerID="17705c617ee0ba8a217ff0d3c82c8dcd71d88ebe4460028009d1634979697029" Feb 28 03:34:49 crc kubenswrapper[4819]: E0228 03:34:49.053143 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.291449 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:49Z is after 2026-02-23T05:33:13Z Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.684936 4819 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.685027 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.685095 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.685302 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.686381 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.686421 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.686433 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.687015 4819 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"4e774a8d641740281d566829c580c19c15c1309f5c24333701556e3cd984480b"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed 
startup probe, will be restarted" Feb 28 03:34:49 crc kubenswrapper[4819]: I0228 03:34:49.687191 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://4e774a8d641740281d566829c580c19c15c1309f5c24333701556e3cd984480b" gracePeriod=30 Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.291284 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:50Z is after 2026-02-23T05:33:13Z Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.578067 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.578642 4819 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4e774a8d641740281d566829c580c19c15c1309f5c24333701556e3cd984480b" exitCode=255 Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.578749 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4e774a8d641740281d566829c580c19c15c1309f5c24333701556e3cd984480b"} Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.578846 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"11d618fcb5b6c1ff0dbd86f992db3aaef7fbbf5b535e4a52eddb718615156150"} 
Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.579020 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.580402 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.580447 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.580460 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:50 crc kubenswrapper[4819]: W0228 03:34:50.825717 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:50Z is after 2026-02-23T05:33:13Z Feb 28 03:34:50 crc kubenswrapper[4819]: E0228 03:34:50.825828 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.924978 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:50 crc kubenswrapper[4819]: E0228 03:34:50.927238 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:50Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.927550 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.927611 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.927631 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:50 crc kubenswrapper[4819]: I0228 03:34:50.927670 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 03:34:50 crc kubenswrapper[4819]: E0228 03:34:50.932728 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:50Z is after 2026-02-23T05:33:13Z" node="crc" Feb 28 03:34:51 crc kubenswrapper[4819]: I0228 03:34:51.291564 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:51Z is after 2026-02-23T05:33:13Z Feb 28 03:34:51 crc kubenswrapper[4819]: W0228 03:34:51.612715 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-28T03:34:51Z is after 2026-02-23T05:33:13Z Feb 28 03:34:51 crc kubenswrapper[4819]: E0228 03:34:51.612802 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:34:52 crc kubenswrapper[4819]: I0228 03:34:52.291170 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:52Z is after 2026-02-23T05:33:13Z Feb 28 03:34:52 crc kubenswrapper[4819]: E0228 03:34:52.488397 4819 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 03:34:52 crc kubenswrapper[4819]: I0228 03:34:52.604371 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:52 crc kubenswrapper[4819]: I0228 03:34:52.604573 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:52 crc kubenswrapper[4819]: I0228 03:34:52.606086 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:52 crc kubenswrapper[4819]: I0228 03:34:52.606124 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:52 crc kubenswrapper[4819]: I0228 03:34:52.606136 4819 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:53 crc kubenswrapper[4819]: I0228 03:34:53.290616 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:53Z is after 2026-02-23T05:33:13Z Feb 28 03:34:54 crc kubenswrapper[4819]: I0228 03:34:54.290434 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:54Z is after 2026-02-23T05:33:13Z Feb 28 03:34:55 crc kubenswrapper[4819]: I0228 03:34:55.291818 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:55Z is after 2026-02-23T05:33:13Z Feb 28 03:34:56 crc kubenswrapper[4819]: I0228 03:34:56.294814 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:56Z is after 2026-02-23T05:33:13Z Feb 28 03:34:56 crc kubenswrapper[4819]: I0228 03:34:56.683945 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:34:56 crc kubenswrapper[4819]: I0228 03:34:56.684144 4819 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:56 crc kubenswrapper[4819]: I0228 03:34:56.685543 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:56 crc kubenswrapper[4819]: I0228 03:34:56.685589 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:56 crc kubenswrapper[4819]: I0228 03:34:56.685608 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:57 crc kubenswrapper[4819]: I0228 03:34:57.405672 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:57Z is after 2026-02-23T05:33:13Z Feb 28 03:34:57 crc kubenswrapper[4819]: E0228 03:34:57.520565 4819 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:57Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18984bad95c91055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC m=+0.750266583,LastTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC m=+0.750266583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:34:57 crc kubenswrapper[4819]: I0228 
03:34:57.933233 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:34:57 crc kubenswrapper[4819]: E0228 03:34:57.934522 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:57Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 28 03:34:57 crc kubenswrapper[4819]: I0228 03:34:57.934857 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:34:57 crc kubenswrapper[4819]: I0228 03:34:57.934895 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:34:57 crc kubenswrapper[4819]: I0228 03:34:57.934906 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:34:57 crc kubenswrapper[4819]: I0228 03:34:57.934933 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 03:34:57 crc kubenswrapper[4819]: E0228 03:34:57.939718 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:57Z is after 2026-02-23T05:33:13Z" node="crc" Feb 28 03:34:58 crc kubenswrapper[4819]: I0228 03:34:58.288763 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:58Z is after 2026-02-23T05:33:13Z Feb 28 03:34:59 crc kubenswrapper[4819]: I0228 
03:34:59.291980 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:34:59Z is after 2026-02-23T05:33:13Z Feb 28 03:34:59 crc kubenswrapper[4819]: I0228 03:34:59.684907 4819 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 28 03:34:59 crc kubenswrapper[4819]: I0228 03:34:59.685030 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 28 03:35:00 crc kubenswrapper[4819]: I0228 03:35:00.291001 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:00Z is after 2026-02-23T05:33:13Z Feb 28 03:35:01 crc kubenswrapper[4819]: I0228 03:35:01.291098 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:01Z is after 
2026-02-23T05:33:13Z Feb 28 03:35:01 crc kubenswrapper[4819]: I0228 03:35:01.946359 4819 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 28 03:35:01 crc kubenswrapper[4819]: E0228 03:35:01.951707 4819 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:35:01 crc kubenswrapper[4819]: E0228 03:35:01.952929 4819 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Feb 28 03:35:02 crc kubenswrapper[4819]: I0228 03:35:02.292164 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:02Z is after 2026-02-23T05:33:13Z Feb 28 03:35:02 crc kubenswrapper[4819]: I0228 03:35:02.368847 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:02 crc kubenswrapper[4819]: I0228 03:35:02.370375 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:02 crc kubenswrapper[4819]: I0228 03:35:02.370625 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:02 crc kubenswrapper[4819]: I0228 03:35:02.370778 4819 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:02 crc kubenswrapper[4819]: I0228 03:35:02.371892 4819 scope.go:117] "RemoveContainer" containerID="17705c617ee0ba8a217ff0d3c82c8dcd71d88ebe4460028009d1634979697029" Feb 28 03:35:02 crc kubenswrapper[4819]: E0228 03:35:02.488505 4819 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 03:35:03 crc kubenswrapper[4819]: I0228 03:35:03.290777 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:03Z is after 2026-02-23T05:33:13Z Feb 28 03:35:03 crc kubenswrapper[4819]: W0228 03:35:03.294061 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:03Z is after 2026-02-23T05:33:13Z Feb 28 03:35:03 crc kubenswrapper[4819]: E0228 03:35:03.294190 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:35:03 crc kubenswrapper[4819]: I0228 03:35:03.620405 4819 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 28 03:35:03 crc kubenswrapper[4819]: I0228 03:35:03.621297 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 28 03:35:03 crc kubenswrapper[4819]: I0228 03:35:03.624685 4819 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e4562c304642d62a207be4533f8067e89e5fb6b39b6f0aef6bc6ee0ba6ca8f6a" exitCode=255 Feb 28 03:35:03 crc kubenswrapper[4819]: I0228 03:35:03.624725 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e4562c304642d62a207be4533f8067e89e5fb6b39b6f0aef6bc6ee0ba6ca8f6a"} Feb 28 03:35:03 crc kubenswrapper[4819]: I0228 03:35:03.624796 4819 scope.go:117] "RemoveContainer" containerID="17705c617ee0ba8a217ff0d3c82c8dcd71d88ebe4460028009d1634979697029" Feb 28 03:35:03 crc kubenswrapper[4819]: I0228 03:35:03.624946 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:03 crc kubenswrapper[4819]: I0228 03:35:03.626337 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:03 crc kubenswrapper[4819]: I0228 03:35:03.626396 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:03 crc kubenswrapper[4819]: I0228 03:35:03.626415 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:03 crc kubenswrapper[4819]: I0228 03:35:03.627375 4819 scope.go:117] "RemoveContainer" containerID="e4562c304642d62a207be4533f8067e89e5fb6b39b6f0aef6bc6ee0ba6ca8f6a" Feb 28 03:35:03 
crc kubenswrapper[4819]: E0228 03:35:03.627688 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:35:04 crc kubenswrapper[4819]: I0228 03:35:04.291361 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:04Z is after 2026-02-23T05:33:13Z Feb 28 03:35:04 crc kubenswrapper[4819]: I0228 03:35:04.629406 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 28 03:35:04 crc kubenswrapper[4819]: I0228 03:35:04.939867 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:04 crc kubenswrapper[4819]: E0228 03:35:04.940820 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:04Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 28 03:35:04 crc kubenswrapper[4819]: I0228 03:35:04.941920 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:04 crc kubenswrapper[4819]: I0228 03:35:04.941986 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 28 03:35:04 crc kubenswrapper[4819]: I0228 03:35:04.942007 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:04 crc kubenswrapper[4819]: I0228 03:35:04.942044 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 03:35:04 crc kubenswrapper[4819]: E0228 03:35:04.947002 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:04Z is after 2026-02-23T05:33:13Z" node="crc" Feb 28 03:35:05 crc kubenswrapper[4819]: I0228 03:35:05.292166 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:05Z is after 2026-02-23T05:33:13Z Feb 28 03:35:06 crc kubenswrapper[4819]: I0228 03:35:06.294574 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:06Z is after 2026-02-23T05:33:13Z Feb 28 03:35:07 crc kubenswrapper[4819]: I0228 03:35:07.290861 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:07Z is after 2026-02-23T05:33:13Z Feb 28 03:35:07 crc kubenswrapper[4819]: E0228 03:35:07.526170 4819 event.go:368] "Unable to write event (may 
retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:07Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18984bad95c91055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC m=+0.750266583,LastTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC m=+0.750266583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:07 crc kubenswrapper[4819]: W0228 03:35:07.807397 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:07Z is after 2026-02-23T05:33:13Z Feb 28 03:35:07 crc kubenswrapper[4819]: E0228 03:35:07.807524 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:35:07 crc kubenswrapper[4819]: I0228 03:35:07.815654 4819 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:35:07 crc kubenswrapper[4819]: I0228 
03:35:07.815868 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:07 crc kubenswrapper[4819]: I0228 03:35:07.817294 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:07 crc kubenswrapper[4819]: I0228 03:35:07.817344 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:07 crc kubenswrapper[4819]: I0228 03:35:07.817362 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:07 crc kubenswrapper[4819]: I0228 03:35:07.818032 4819 scope.go:117] "RemoveContainer" containerID="e4562c304642d62a207be4533f8067e89e5fb6b39b6f0aef6bc6ee0ba6ca8f6a" Feb 28 03:35:07 crc kubenswrapper[4819]: E0228 03:35:07.818338 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:35:08 crc kubenswrapper[4819]: I0228 03:35:08.290490 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:08Z is after 2026-02-23T05:33:13Z Feb 28 03:35:08 crc kubenswrapper[4819]: W0228 03:35:08.724081 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-02-28T03:35:08Z is after 2026-02-23T05:33:13Z Feb 28 03:35:08 crc kubenswrapper[4819]: E0228 03:35:08.724194 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 28 03:35:09 crc kubenswrapper[4819]: I0228 03:35:09.051306 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:35:09 crc kubenswrapper[4819]: I0228 03:35:09.051554 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:09 crc kubenswrapper[4819]: I0228 03:35:09.052990 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:09 crc kubenswrapper[4819]: I0228 03:35:09.053031 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:09 crc kubenswrapper[4819]: I0228 03:35:09.053044 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:09 crc kubenswrapper[4819]: I0228 03:35:09.053611 4819 scope.go:117] "RemoveContainer" containerID="e4562c304642d62a207be4533f8067e89e5fb6b39b6f0aef6bc6ee0ba6ca8f6a" Feb 28 03:35:09 crc kubenswrapper[4819]: E0228 03:35:09.053804 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:35:09 crc kubenswrapper[4819]: I0228 03:35:09.288772 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:09Z is after 2026-02-23T05:33:13Z Feb 28 03:35:09 crc kubenswrapper[4819]: I0228 03:35:09.685480 4819 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 28 03:35:09 crc kubenswrapper[4819]: I0228 03:35:09.685567 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 28 03:35:10 crc kubenswrapper[4819]: I0228 03:35:10.291600 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:10Z is after 2026-02-23T05:33:13Z Feb 28 03:35:10 crc kubenswrapper[4819]: I0228 03:35:10.331070 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 28 03:35:10 crc kubenswrapper[4819]: I0228 03:35:10.331301 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:10 crc kubenswrapper[4819]: I0228 03:35:10.332863 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:10 crc kubenswrapper[4819]: I0228 03:35:10.332916 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:10 crc kubenswrapper[4819]: I0228 03:35:10.332940 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:11 crc kubenswrapper[4819]: I0228 03:35:11.290900 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:35:11Z is after 2026-02-23T05:33:13Z Feb 28 03:35:11 crc kubenswrapper[4819]: I0228 03:35:11.947107 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:11 crc kubenswrapper[4819]: W0228 03:35:11.947703 4819 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 28 03:35:11 crc kubenswrapper[4819]: E0228 03:35:11.947767 4819 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" 
logger="UnhandledError" Feb 28 03:35:11 crc kubenswrapper[4819]: E0228 03:35:11.947848 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 03:35:11 crc kubenswrapper[4819]: I0228 03:35:11.948987 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:11 crc kubenswrapper[4819]: I0228 03:35:11.949046 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:11 crc kubenswrapper[4819]: I0228 03:35:11.949067 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:11 crc kubenswrapper[4819]: I0228 03:35:11.949110 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 03:35:11 crc kubenswrapper[4819]: E0228 03:35:11.956106 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 03:35:12 crc kubenswrapper[4819]: I0228 03:35:12.293467 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:12 crc kubenswrapper[4819]: E0228 03:35:12.488604 4819 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 03:35:13 crc kubenswrapper[4819]: I0228 03:35:13.293565 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:14 crc kubenswrapper[4819]: I0228 03:35:14.292600 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:15 crc kubenswrapper[4819]: I0228 03:35:15.292640 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:16 crc kubenswrapper[4819]: I0228 03:35:16.294003 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:17 crc kubenswrapper[4819]: I0228 03:35:17.293497 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.533474 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad95c91055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC m=+0.750266583,LastTimestamp:2026-02-28 03:34:22.284697685 +0000 UTC 
m=+0.750266583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.540651 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3c44e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359356649 +0000 UTC m=+0.824925547,LastTimestamp:2026-02-28 03:34:22.359356649 +0000 UTC m=+0.824925547,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.547115 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cb2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.35938475 +0000 UTC m=+0.824953648,LastTimestamp:2026-02-28 03:34:22.35938475 +0000 UTC m=+0.824953648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 
03:35:17.555232 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cfe6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359404141 +0000 UTC m=+0.824973039,LastTimestamp:2026-02-28 03:34:22.359404141 +0000 UTC m=+0.824973039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.562452 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bada1229aea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.475115242 +0000 UTC m=+0.940684100,LastTimestamp:2026-02-28 03:34:22.475115242 +0000 UTC m=+0.940684100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.569173 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3c44e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3c44e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359356649 +0000 UTC m=+0.824925547,LastTimestamp:2026-02-28 03:34:22.573667137 +0000 UTC m=+1.039236025,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.578328 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cb2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cb2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.35938475 +0000 UTC m=+0.824953648,LastTimestamp:2026-02-28 03:34:22.573698417 +0000 UTC m=+1.039267305,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.584882 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cfe6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cfe6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359404141 +0000 UTC m=+0.824973039,LastTimestamp:2026-02-28 03:34:22.573714498 +0000 UTC m=+1.039283396,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.591624 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3c44e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3c44e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359356649 +0000 UTC m=+0.824925547,LastTimestamp:2026-02-28 03:34:22.671242406 +0000 UTC m=+1.136811304,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.599198 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cb2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cb2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.35938475 +0000 UTC m=+0.824953648,LastTimestamp:2026-02-28 03:34:22.671325748 +0000 UTC m=+1.136894636,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.604554 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cfe6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cfe6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359404141 +0000 UTC m=+0.824973039,LastTimestamp:2026-02-28 03:34:22.671353669 +0000 UTC m=+1.136922677,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.610765 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3c44e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3c44e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359356649 +0000 UTC 
m=+0.824925547,LastTimestamp:2026-02-28 03:34:22.673779421 +0000 UTC m=+1.139348319,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.617003 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3c44e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3c44e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359356649 +0000 UTC m=+0.824925547,LastTimestamp:2026-02-28 03:34:22.673839842 +0000 UTC m=+1.139408740,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.623190 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cb2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cb2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.35938475 +0000 UTC m=+0.824953648,LastTimestamp:2026-02-28 03:34:22.673864523 +0000 UTC m=+1.139433421,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.633320 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cfe6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cfe6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359404141 +0000 UTC m=+0.824973039,LastTimestamp:2026-02-28 03:34:22.673883423 +0000 UTC m=+1.139452321,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.640002 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cb2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cb2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.35938475 +0000 UTC m=+0.824953648,LastTimestamp:2026-02-28 03:34:22.673927434 +0000 UTC m=+1.139496322,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.646683 4819 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cfe6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cfe6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359404141 +0000 UTC m=+0.824973039,LastTimestamp:2026-02-28 03:34:22.673950895 +0000 UTC m=+1.139519793,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.653551 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3c44e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3c44e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359356649 +0000 UTC m=+0.824925547,LastTimestamp:2026-02-28 03:34:22.675520975 +0000 UTC m=+1.141089873,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.660511 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cb2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cb2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.35938475 +0000 UTC m=+0.824953648,LastTimestamp:2026-02-28 03:34:22.675542446 +0000 UTC m=+1.141111344,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.667557 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cfe6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cfe6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359404141 +0000 UTC m=+0.824973039,LastTimestamp:2026-02-28 03:34:22.675561446 +0000 UTC m=+1.141130334,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.674162 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3c44e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3c44e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359356649 +0000 UTC m=+0.824925547,LastTimestamp:2026-02-28 03:34:22.675924305 +0000 UTC m=+1.141493203,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.681355 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cb2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cb2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.35938475 +0000 UTC m=+0.824953648,LastTimestamp:2026-02-28 03:34:22.675949996 +0000 UTC m=+1.141518894,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.687587 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cfe6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cfe6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359404141 +0000 UTC m=+0.824973039,LastTimestamp:2026-02-28 03:34:22.675966766 +0000 UTC m=+1.141535664,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.694126 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3c44e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3c44e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.359356649 +0000 UTC m=+0.824925547,LastTimestamp:2026-02-28 03:34:22.678579033 +0000 UTC m=+1.144147921,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.701195 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18984bad9a3cb2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18984bad9a3cb2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:22.35938475 +0000 UTC 
m=+0.824953648,LastTimestamp:2026-02-28 03:34:22.678597743 +0000 UTC m=+1.144166641,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.709395 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984badc63e49f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.097686515 +0000 UTC m=+1.563255373,LastTimestamp:2026-02-28 03:34:23.097686515 +0000 UTC m=+1.563255373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.715815 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984badc64a9d61 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.098494305 +0000 UTC m=+1.564063163,LastTimestamp:2026-02-28 03:34:23.098494305 +0000 UTC m=+1.564063163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.722464 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984badc6c54250 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.10653192 +0000 UTC m=+1.572100768,LastTimestamp:2026-02-28 03:34:23.10653192 +0000 UTC m=+1.572100768,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.728561 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984badc743d702 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.114827522 +0000 UTC m=+1.580396390,LastTimestamp:2026-02-28 03:34:23.114827522 +0000 UTC m=+1.580396390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.734944 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984badc8fd793e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.14377043 +0000 UTC m=+1.609339318,LastTimestamp:2026-02-28 03:34:23.14377043 +0000 UTC m=+1.609339318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.741795 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984badeb794037 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.722307639 +0000 UTC m=+2.187876537,LastTimestamp:2026-02-28 03:34:23.722307639 +0000 UTC m=+2.187876537,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.746628 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984badec3fcbaf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.735319471 +0000 UTC 
m=+2.200888339,LastTimestamp:2026-02-28 03:34:23.735319471 +0000 UTC m=+2.200888339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.753357 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984badecce908b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.744675979 +0000 UTC m=+2.210244847,LastTimestamp:2026-02-28 03:34:23.744675979 +0000 UTC m=+2.210244847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.759408 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984badeccf706b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.744733291 +0000 UTC m=+2.210302159,LastTimestamp:2026-02-28 03:34:23.744733291 +0000 UTC m=+2.210302159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.764377 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984badeccf9005 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.744741381 +0000 UTC m=+2.210310279,LastTimestamp:2026-02-28 03:34:23.744741381 +0000 UTC m=+2.210310279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.768723 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984badecdaf0e8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started 
container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.74548708 +0000 UTC m=+2.211055948,LastTimestamp:2026-02-28 03:34:23.74548708 +0000 UTC m=+2.211055948,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.773356 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984baded036378 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.748137848 +0000 UTC m=+2.213706746,LastTimestamp:2026-02-28 03:34:23.748137848 +0000 UTC m=+2.213706746,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.778760 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984baded855eed openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.756656365 +0000 UTC m=+2.222225233,LastTimestamp:2026-02-28 03:34:23.756656365 +0000 UTC m=+2.222225233,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.784753 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984badee07a847 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.765194823 +0000 UTC m=+2.230763681,LastTimestamp:2026-02-28 03:34:23.765194823 +0000 UTC m=+2.230763681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.790944 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984badee934472 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.774344306 +0000 UTC m=+2.239913174,LastTimestamp:2026-02-28 03:34:23.774344306 +0000 UTC m=+2.239913174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.796955 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984badeeadf1a5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.776092581 +0000 UTC m=+2.241661439,LastTimestamp:2026-02-28 03:34:23.776092581 +0000 UTC m=+2.241661439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.803231 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bae02f8d8f0 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.116545776 +0000 UTC m=+2.582114674,LastTimestamp:2026-02-28 03:34:24.116545776 +0000 UTC m=+2.582114674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.810348 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bae03c160ef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.129687791 +0000 UTC m=+2.595256689,LastTimestamp:2026-02-28 03:34:24.129687791 +0000 UTC m=+2.595256689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.816960 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bae03d776e8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.131135208 +0000 UTC m=+2.596704106,LastTimestamp:2026-02-28 03:34:24.131135208 +0000 UTC m=+2.596704106,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.823948 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984bae1388c214 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.394412564 +0000 UTC 
m=+2.859981452,LastTimestamp:2026-02-28 03:34:24.394412564 +0000 UTC m=+2.859981452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.830771 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae13fe231f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.402105119 +0000 UTC m=+2.867674007,LastTimestamp:2026-02-28 03:34:24.402105119 +0000 UTC m=+2.867674007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.837877 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984bae149ad323 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.412373795 +0000 UTC m=+2.877942683,LastTimestamp:2026-02-28 03:34:24.412373795 +0000 UTC m=+2.877942683,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.844737 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984bae14d85218 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.416403992 +0000 UTC m=+2.881972880,LastTimestamp:2026-02-28 03:34:24.416403992 +0000 UTC m=+2.881972880,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.851171 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bae15409732 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.423237426 +0000 UTC m=+2.888806324,LastTimestamp:2026-02-28 03:34:24.423237426 +0000 UTC m=+2.888806324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.857584 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bae176cda06 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.45969255 +0000 UTC m=+2.925261418,LastTimestamp:2026-02-28 03:34:24.45969255 +0000 UTC m=+2.925261418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.862022 4819 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bae1788b76f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.461518703 +0000 UTC m=+2.927087571,LastTimestamp:2026-02-28 03:34:24.461518703 +0000 UTC m=+2.927087571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.868510 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984bae21bc1766 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.632657766 +0000 UTC 
m=+3.098226624,LastTimestamp:2026-02-28 03:34:24.632657766 +0000 UTC m=+3.098226624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.874711 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984bae229f39c1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.647543233 +0000 UTC m=+3.113112091,LastTimestamp:2026-02-28 03:34:24.647543233 +0000 UTC m=+3.113112091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.880642 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984bae22b9f6f5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.649295605 +0000 UTC m=+3.114864463,LastTimestamp:2026-02-28 03:34:24.649295605 +0000 UTC m=+3.114864463,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.886720 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae23462913 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.658483475 +0000 UTC m=+3.124052333,LastTimestamp:2026-02-28 03:34:24.658483475 +0000 UTC m=+3.124052333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.892440 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984bae238407e4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.662538212 +0000 UTC m=+3.128107070,LastTimestamp:2026-02-28 03:34:24.662538212 +0000 UTC m=+3.128107070,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.898087 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984bae23be62d8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.666362584 +0000 UTC m=+3.131931452,LastTimestamp:2026-02-28 03:34:24.666362584 +0000 UTC m=+3.131931452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.903470 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bae24660991 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.677349777 +0000 UTC m=+3.142918635,LastTimestamp:2026-02-28 03:34:24.677349777 +0000 UTC m=+3.142918635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.909418 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae24953a04 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.680442372 +0000 UTC m=+3.146011230,LastTimestamp:2026-02-28 03:34:24.680442372 +0000 UTC m=+3.146011230,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.915594 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae24a3c379 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.681395065 +0000 UTC m=+3.146963923,LastTimestamp:2026-02-28 03:34:24.681395065 +0000 UTC m=+3.146963923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.921923 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18984bae25170632 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.688948786 +0000 UTC m=+3.154517644,LastTimestamp:2026-02-28 03:34:24.688948786 +0000 UTC m=+3.154517644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.925979 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bae2602d67a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.704403066 +0000 UTC m=+3.169971924,LastTimestamp:2026-02-28 03:34:24.704403066 +0000 UTC m=+3.169971924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.929662 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984bae305dd24c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.878137932 +0000 UTC m=+3.343706790,LastTimestamp:2026-02-28 03:34:24.878137932 +0000 UTC m=+3.343706790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.935558 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae316ef89f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.896039071 +0000 UTC m=+3.361607919,LastTimestamp:2026-02-28 03:34:24.896039071 +0000 UTC m=+3.361607919,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.941600 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984bae31b0d4ac openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.900355244 +0000 UTC m=+3.365924102,LastTimestamp:2026-02-28 03:34:24.900355244 +0000 UTC m=+3.365924102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.947723 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984bae31c395ec openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.901584364 +0000 UTC m=+3.367153222,LastTimestamp:2026-02-28 03:34:24.901584364 +0000 UTC m=+3.367153222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.953945 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae336cfef4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.929464052 +0000 UTC m=+3.395032920,LastTimestamp:2026-02-28 03:34:24.929464052 +0000 UTC m=+3.395032920,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.959682 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae3397b32c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.9322627 +0000 UTC m=+3.397831558,LastTimestamp:2026-02-28 03:34:24.9322627 +0000 UTC m=+3.397831558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.963505 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984bae400fdfb8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.141465016 +0000 UTC m=+3.607033874,LastTimestamp:2026-02-28 03:34:25.141465016 +0000 UTC m=+3.607033874,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.966718 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18984bae413bf7f1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.161132017 
+0000 UTC m=+3.626700895,LastTimestamp:2026-02-28 03:34:25.161132017 +0000 UTC m=+3.626700895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.977717 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae4185505e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.165938782 +0000 UTC m=+3.631507640,LastTimestamp:2026-02-28 03:34:25.165938782 +0000 UTC m=+3.631507640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.984592 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984bae42a2a49f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container 
etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.184638111 +0000 UTC m=+3.650206969,LastTimestamp:2026-02-28 03:34:25.184638111 +0000 UTC m=+3.650206969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.990686 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae42d16f9e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.187704734 +0000 UTC m=+3.653273592,LastTimestamp:2026-02-28 03:34:25.187704734 +0000 UTC m=+3.653273592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:17 crc kubenswrapper[4819]: E0228 03:35:17.997012 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae42e69f80 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.189093248 +0000 UTC m=+3.654662126,LastTimestamp:2026-02-28 03:34:25.189093248 +0000 UTC m=+3.654662126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.003467 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae4fbf7cde openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.404632286 +0000 UTC m=+3.870201164,LastTimestamp:2026-02-28 03:34:25.404632286 +0000 UTC m=+3.870201164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.012999 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984bae51374f7e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.429262206 +0000 UTC m=+3.894831074,LastTimestamp:2026-02-28 03:34:25.429262206 +0000 UTC m=+3.894831074,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.019588 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae515cad3b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.431711035 +0000 UTC m=+3.897279903,LastTimestamp:2026-02-28 03:34:25.431711035 +0000 UTC m=+3.897279903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc 
kubenswrapper[4819]: E0228 03:35:18.025890 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae5172a9a6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.43315191 +0000 UTC m=+3.898720778,LastTimestamp:2026-02-28 03:34:25.43315191 +0000 UTC m=+3.898720778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.032747 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae606ed324 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.684558628 +0000 UTC 
m=+4.150127506,LastTimestamp:2026-02-28 03:34:25.684558628 +0000 UTC m=+4.150127506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.039002 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984bae60c9296f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.690478959 +0000 UTC m=+4.156047827,LastTimestamp:2026-02-28 03:34:25.690478959 +0000 UTC m=+4.156047827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.045284 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bae615d003e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 
03:34:25.700167742 +0000 UTC m=+4.165736600,LastTimestamp:2026-02-28 03:34:25.700167742 +0000 UTC m=+4.165736600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.051912 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984bae61c31130 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:25.706856752 +0000 UTC m=+4.172425610,LastTimestamp:2026-02-28 03:34:25.706856752 +0000 UTC m=+4.172425610,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.059584 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984bae8e4b1b70 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:26.453969776 +0000 UTC m=+4.919538674,LastTimestamp:2026-02-28 03:34:26.453969776 +0000 UTC m=+4.919538674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.065726 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baeb12b31bf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:27.039080895 +0000 UTC m=+5.504649783,LastTimestamp:2026-02-28 03:34:27.039080895 +0000 UTC m=+5.504649783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.072066 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baeb27d1804 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 
03:34:27.061225476 +0000 UTC m=+5.526794374,LastTimestamp:2026-02-28 03:34:27.061225476 +0000 UTC m=+5.526794374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.078073 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baeb2994a28 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:27.06307332 +0000 UTC m=+5.528642218,LastTimestamp:2026-02-28 03:34:27.06307332 +0000 UTC m=+5.528642218,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.084719 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baec4e2d24b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container 
etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:27.369882187 +0000 UTC m=+5.835451075,LastTimestamp:2026-02-28 03:34:27.369882187 +0000 UTC m=+5.835451075,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.090915 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baec5e1951f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:27.386578207 +0000 UTC m=+5.852147095,LastTimestamp:2026-02-28 03:34:27.386578207 +0000 UTC m=+5.852147095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.097320 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baec601ad38 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:27.388681528 +0000 UTC m=+5.854250426,LastTimestamp:2026-02-28 03:34:27.388681528 +0000 UTC m=+5.854250426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.104355 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baed9a756d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:27.718305491 +0000 UTC m=+6.183874369,LastTimestamp:2026-02-28 03:34:27.718305491 +0000 UTC m=+6.183874369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.110580 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baedc879a18 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:27.766557208 +0000 UTC m=+6.232126076,LastTimestamp:2026-02-28 03:34:27.766557208 +0000 UTC m=+6.232126076,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.116919 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baedca3cbec openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:27.768404972 +0000 UTC m=+6.233973860,LastTimestamp:2026-02-28 03:34:27.768404972 +0000 UTC m=+6.233973860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.120801 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baeecd69d43 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:28.040170819 +0000 UTC m=+6.505739707,LastTimestamp:2026-02-28 03:34:28.040170819 +0000 UTC m=+6.505739707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.126785 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baeedf42919 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:28.058884377 +0000 UTC m=+6.524453275,LastTimestamp:2026-02-28 03:34:28.058884377 +0000 UTC m=+6.524453275,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.133004 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baeee0ee3cd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:28.060636109 +0000 UTC m=+6.526204977,LastTimestamp:2026-02-28 03:34:28.060636109 +0000 UTC m=+6.526204977,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.139710 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baf0b6df0d9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:28.553404633 +0000 UTC m=+7.018973521,LastTimestamp:2026-02-28 03:34:28.553404633 +0000 UTC m=+7.018973521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.145985 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18984baf0c813ede openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:28.571447006 +0000 UTC m=+7.037015894,LastTimestamp:2026-02-28 03:34:28.571447006 +0000 UTC m=+7.037015894,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.153824 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 28 03:35:18 crc kubenswrapper[4819]: &Event{ObjectMeta:{kube-controller-manager-crc.18984baf4edc8756 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 28 03:35:18 crc kubenswrapper[4819]: body: Feb 28 03:35:18 crc kubenswrapper[4819]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:29.68472559 +0000 UTC m=+8.150294458,LastTimestamp:2026-02-28 03:34:29.68472559 +0000 UTC m=+8.150294458,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 03:35:18 crc kubenswrapper[4819]: > 
Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.160164 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984baf4ede2da6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:29.684833702 +0000 UTC m=+8.150402570,LastTimestamp:2026-02-28 03:34:29.684833702 +0000 UTC m=+8.150402570,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.168287 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 28 03:35:18 crc kubenswrapper[4819]: &Event{ObjectMeta:{kube-apiserver-crc.18984bb121b5574d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 28 03:35:18 crc 
kubenswrapper[4819]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 28 03:35:18 crc kubenswrapper[4819]: Feb 28 03:35:18 crc kubenswrapper[4819]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:37.517117261 +0000 UTC m=+15.982686159,LastTimestamp:2026-02-28 03:34:37.517117261 +0000 UTC m=+15.982686159,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 03:35:18 crc kubenswrapper[4819]: > Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.174365 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bb121b610db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:37.517164763 +0000 UTC m=+15.982733651,LastTimestamp:2026-02-28 03:34:37.517164763 +0000 UTC m=+15.982733651,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.180666 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18984bb121b5574d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-apiserver\"" event=< Feb 28 03:35:18 crc kubenswrapper[4819]: &Event{ObjectMeta:{kube-apiserver-crc.18984bb121b5574d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 28 03:35:18 crc kubenswrapper[4819]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 28 03:35:18 crc kubenswrapper[4819]: Feb 28 03:35:18 crc kubenswrapper[4819]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:37.517117261 +0000 UTC m=+15.982686159,LastTimestamp:2026-02-28 03:34:37.5214088 +0000 UTC m=+15.986977658,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 03:35:18 crc kubenswrapper[4819]: > Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.187066 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18984bb121b610db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bb121b610db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:37.517164763 +0000 UTC m=+15.982733651,LastTimestamp:2026-02-28 03:34:37.521438731 +0000 UTC m=+15.987007589,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.193369 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 28 03:35:18 crc kubenswrapper[4819]: &Event{ObjectMeta:{kube-apiserver-crc.18984bb1338ce04a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Feb 28 03:35:18 crc kubenswrapper[4819]: body: Feb 28 03:35:18 crc kubenswrapper[4819]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:37.816455242 +0000 UTC m=+16.282024140,LastTimestamp:2026-02-28 03:34:37.816455242 +0000 UTC m=+16.282024140,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 03:35:18 crc kubenswrapper[4819]: > Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.199597 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18984bb1338db5f8 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:37.816509944 +0000 UTC m=+16.282078842,LastTimestamp:2026-02-28 03:34:37.816509944 +0000 UTC m=+16.282078842,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.205850 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 28 03:35:18 crc kubenswrapper[4819]: &Event{ObjectMeta:{kube-apiserver-crc.18984bb138ca2ada openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Feb 28 03:35:18 crc kubenswrapper[4819]: body: [+]ping ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]log ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]etcd ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 28 03:35:18 crc kubenswrapper[4819]: 
[+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/generic-apiserver-start-informers ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/priority-and-fairness-filter ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/start-apiextensions-informers ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/start-apiextensions-controllers ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/crd-informer-synced ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/start-system-namespaces-controller ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 28 03:35:18 crc kubenswrapper[4819]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 28 03:35:18 crc kubenswrapper[4819]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 28 03:35:18 crc kubenswrapper[4819]: 
[+]poststarthook/bootstrap-controller ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/start-kube-aggregator-informers ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/apiservice-registration-controller ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/apiservice-discovery-controller ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]autoregister-completion ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/apiservice-openapi-controller ok Feb 28 03:35:18 crc kubenswrapper[4819]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 28 03:35:18 crc kubenswrapper[4819]: livez check failed Feb 28 03:35:18 crc kubenswrapper[4819]: Feb 28 03:35:18 crc kubenswrapper[4819]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:37.904358106 +0000 UTC m=+16.369926994,LastTimestamp:2026-02-28 03:34:37.904358106 +0000 UTC m=+16.369926994,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 03:35:18 crc kubenswrapper[4819]: > Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.215742 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 28 03:35:18 crc kubenswrapper[4819]: &Event{ObjectMeta:{kube-controller-manager-crc.18984bb1a2e4dd4a 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Feb 28 03:35:18 crc kubenswrapper[4819]: body: Feb 28 03:35:18 crc kubenswrapper[4819]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:39.684492618 +0000 UTC m=+18.150061506,LastTimestamp:2026-02-28 03:34:39.684492618 +0000 UTC m=+18.150061506,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 03:35:18 crc kubenswrapper[4819]: > Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.222457 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bb1a2e615ea openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:39.68457265 +0000 UTC m=+18.150141548,LastTimestamp:2026-02-28 03:34:39.68457265 +0000 UTC 
m=+18.150141548,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.227181 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984baf4edc8756\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 28 03:35:18 crc kubenswrapper[4819]: &Event{ObjectMeta:{kube-controller-manager-crc.18984baf4edc8756 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 28 03:35:18 crc kubenswrapper[4819]: body: Feb 28 03:35:18 crc kubenswrapper[4819]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:29.68472559 +0000 UTC m=+8.150294458,LastTimestamp:2026-02-28 03:34:49.685008175 +0000 UTC m=+28.150577053,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 03:35:18 crc kubenswrapper[4819]: > Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.233925 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984baf4ede2da6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984baf4ede2da6 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:29.684833702 +0000 UTC m=+8.150402570,LastTimestamp:2026-02-28 03:34:49.685061716 +0000 UTC m=+28.150630584,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.240488 4819 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bb3f719a24c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:49.68717166 +0000 UTC m=+28.152740528,LastTimestamp:2026-02-28 03:34:49.68717166 +0000 UTC m=+28.152740528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc 
kubenswrapper[4819]: E0228 03:35:18.246775 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984baded036378\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984baded036378 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:23.748137848 +0000 UTC m=+2.213706746,LastTimestamp:2026-02-28 03:34:49.808602501 +0000 UTC m=+28.274171399,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.253288 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984bae02f8d8f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bae02f8d8f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created 
container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.116545776 +0000 UTC m=+2.582114674,LastTimestamp:2026-02-28 03:34:50.17230526 +0000 UTC m=+28.637874148,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.259664 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984bae03c160ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984bae03c160ef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:24.129687791 +0000 UTC m=+2.595256689,LastTimestamp:2026-02-28 03:34:50.182051886 +0000 UTC m=+28.647620764,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.267741 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984baf4edc8756\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 28 03:35:18 crc kubenswrapper[4819]: &Event{ObjectMeta:{kube-controller-manager-crc.18984baf4edc8756 openshift-kube-controller-manager 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 28 03:35:18 crc kubenswrapper[4819]: body: Feb 28 03:35:18 crc kubenswrapper[4819]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:29.68472559 +0000 UTC m=+8.150294458,LastTimestamp:2026-02-28 03:34:59.684989947 +0000 UTC m=+38.150558835,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 03:35:18 crc kubenswrapper[4819]: > Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.274394 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984baf4ede2da6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18984baf4ede2da6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:29.684833702 +0000 UTC 
m=+8.150402570,LastTimestamp:2026-02-28 03:34:59.685076619 +0000 UTC m=+38.150645517,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.282665 4819 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18984baf4edc8756\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 28 03:35:18 crc kubenswrapper[4819]: &Event{ObjectMeta:{kube-controller-manager-crc.18984baf4edc8756 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 28 03:35:18 crc kubenswrapper[4819]: body: Feb 28 03:35:18 crc kubenswrapper[4819]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:34:29.68472559 +0000 UTC m=+8.150294458,LastTimestamp:2026-02-28 03:35:09.685540519 +0000 UTC m=+48.151109407,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 03:35:18 crc kubenswrapper[4819]: > Feb 28 03:35:18 crc kubenswrapper[4819]: I0228 03:35:18.292432 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:18 crc 
kubenswrapper[4819]: E0228 03:35:18.954763 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 03:35:18 crc kubenswrapper[4819]: I0228 03:35:18.956902 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:18 crc kubenswrapper[4819]: I0228 03:35:18.958352 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:18 crc kubenswrapper[4819]: I0228 03:35:18.958392 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:18 crc kubenswrapper[4819]: I0228 03:35:18.958405 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:18 crc kubenswrapper[4819]: I0228 03:35:18.958434 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 03:35:18 crc kubenswrapper[4819]: E0228 03:35:18.963236 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 03:35:19 crc kubenswrapper[4819]: I0228 03:35:19.293958 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:19 crc kubenswrapper[4819]: I0228 03:35:19.684274 4819 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 28 03:35:19 crc kubenswrapper[4819]: I0228 03:35:19.684347 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 28 03:35:19 crc kubenswrapper[4819]: I0228 03:35:19.684418 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:35:19 crc kubenswrapper[4819]: I0228 03:35:19.684610 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:19 crc kubenswrapper[4819]: I0228 03:35:19.686311 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:19 crc kubenswrapper[4819]: I0228 03:35:19.686396 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:19 crc kubenswrapper[4819]: I0228 03:35:19.686420 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:19 crc kubenswrapper[4819]: I0228 03:35:19.687222 4819 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"11d618fcb5b6c1ff0dbd86f992db3aaef7fbbf5b535e4a52eddb718615156150"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 28 03:35:19 crc kubenswrapper[4819]: I0228 03:35:19.687438 4819 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://11d618fcb5b6c1ff0dbd86f992db3aaef7fbbf5b535e4a52eddb718615156150" gracePeriod=30 Feb 28 03:35:20 crc kubenswrapper[4819]: I0228 03:35:20.292354 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:20 crc kubenswrapper[4819]: I0228 03:35:20.675902 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 28 03:35:20 crc kubenswrapper[4819]: I0228 03:35:20.676961 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 28 03:35:20 crc kubenswrapper[4819]: I0228 03:35:20.677229 4819 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="11d618fcb5b6c1ff0dbd86f992db3aaef7fbbf5b535e4a52eddb718615156150" exitCode=255 Feb 28 03:35:20 crc kubenswrapper[4819]: I0228 03:35:20.677278 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"11d618fcb5b6c1ff0dbd86f992db3aaef7fbbf5b535e4a52eddb718615156150"} Feb 28 03:35:20 crc kubenswrapper[4819]: I0228 03:35:20.677329 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dbd59238a97546f03c63a8a8c6ad93f642502a72bb7ce6282447988051d68581"} Feb 28 03:35:20 crc kubenswrapper[4819]: I0228 03:35:20.677351 4819 scope.go:117] "RemoveContainer" containerID="4e774a8d641740281d566829c580c19c15c1309f5c24333701556e3cd984480b" Feb 28 03:35:20 crc kubenswrapper[4819]: I0228 03:35:20.677395 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:20 crc kubenswrapper[4819]: I0228 03:35:20.679294 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:20 crc kubenswrapper[4819]: I0228 03:35:20.679320 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:20 crc kubenswrapper[4819]: I0228 03:35:20.679331 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:21 crc kubenswrapper[4819]: I0228 03:35:21.292276 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:21 crc kubenswrapper[4819]: I0228 03:35:21.681723 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 28 03:35:21 crc kubenswrapper[4819]: I0228 03:35:21.682783 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:21 crc kubenswrapper[4819]: I0228 03:35:21.683910 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:21 crc kubenswrapper[4819]: I0228 03:35:21.683954 4819 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:21 crc kubenswrapper[4819]: I0228 03:35:21.683966 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:22 crc kubenswrapper[4819]: I0228 03:35:22.292962 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:22 crc kubenswrapper[4819]: E0228 03:35:22.490022 4819 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 03:35:22 crc kubenswrapper[4819]: I0228 03:35:22.604107 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:35:22 crc kubenswrapper[4819]: I0228 03:35:22.685820 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:22 crc kubenswrapper[4819]: I0228 03:35:22.687167 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:22 crc kubenswrapper[4819]: I0228 03:35:22.687234 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:22 crc kubenswrapper[4819]: I0228 03:35:22.687276 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:23 crc kubenswrapper[4819]: I0228 03:35:23.293838 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.293742 4819 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.368857 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.371636 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.371721 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.371748 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.372978 4819 scope.go:117] "RemoveContainer" containerID="e4562c304642d62a207be4533f8067e89e5fb6b39b6f0aef6bc6ee0ba6ca8f6a" Feb 28 03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.694997 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 28 03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.697152 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6"} Feb 28 03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.697371 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.698708 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 
03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.698746 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:24 crc kubenswrapper[4819]: I0228 03:35:24.698758 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.287558 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.702523 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.703177 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.705333 4819 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6" exitCode=255 Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.705399 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6"} Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.705468 4819 scope.go:117] "RemoveContainer" containerID="e4562c304642d62a207be4533f8067e89e5fb6b39b6f0aef6bc6ee0ba6ca8f6a" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.705755 4819 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.707899 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.707963 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.707983 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.708915 4819 scope.go:117] "RemoveContainer" containerID="0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6" Feb 28 03:35:25 crc kubenswrapper[4819]: E0228 03:35:25.709213 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:35:25 crc kubenswrapper[4819]: E0228 03:35:25.962378 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.964352 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.966068 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.966199 4819 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.966291 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:25 crc kubenswrapper[4819]: I0228 03:35:25.966402 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 03:35:25 crc kubenswrapper[4819]: E0228 03:35:25.974286 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.290890 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.683758 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.683874 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.684646 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.684671 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.684680 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.688513 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.708947 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.710984 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.711801 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.711836 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:26 crc kubenswrapper[4819]: I0228 03:35:26.711848 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:27 crc kubenswrapper[4819]: I0228 03:35:27.292010 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:27 crc kubenswrapper[4819]: I0228 03:35:27.815816 4819 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:35:27 crc kubenswrapper[4819]: I0228 03:35:27.815966 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:27 crc kubenswrapper[4819]: I0228 03:35:27.816998 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:27 crc kubenswrapper[4819]: I0228 03:35:27.817058 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 
28 03:35:27 crc kubenswrapper[4819]: I0228 03:35:27.817073 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:27 crc kubenswrapper[4819]: I0228 03:35:27.817820 4819 scope.go:117] "RemoveContainer" containerID="0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6" Feb 28 03:35:27 crc kubenswrapper[4819]: E0228 03:35:27.818011 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:35:28 crc kubenswrapper[4819]: I0228 03:35:28.292819 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:29 crc kubenswrapper[4819]: I0228 03:35:29.050837 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:35:29 crc kubenswrapper[4819]: I0228 03:35:29.051072 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:29 crc kubenswrapper[4819]: I0228 03:35:29.052520 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:29 crc kubenswrapper[4819]: I0228 03:35:29.052569 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:29 crc kubenswrapper[4819]: I0228 03:35:29.052587 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:29 crc 
kubenswrapper[4819]: I0228 03:35:29.053454 4819 scope.go:117] "RemoveContainer" containerID="0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6" Feb 28 03:35:29 crc kubenswrapper[4819]: E0228 03:35:29.053732 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:35:29 crc kubenswrapper[4819]: I0228 03:35:29.292824 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:29 crc kubenswrapper[4819]: I0228 03:35:29.367914 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:29 crc kubenswrapper[4819]: I0228 03:35:29.369012 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:29 crc kubenswrapper[4819]: I0228 03:35:29.369066 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:29 crc kubenswrapper[4819]: I0228 03:35:29.369083 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:30 crc kubenswrapper[4819]: I0228 03:35:30.292303 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:31 crc kubenswrapper[4819]: I0228 03:35:31.294346 4819 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:32 crc kubenswrapper[4819]: I0228 03:35:32.293655 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:32 crc kubenswrapper[4819]: E0228 03:35:32.490168 4819 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 03:35:32 crc kubenswrapper[4819]: I0228 03:35:32.610380 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:35:32 crc kubenswrapper[4819]: I0228 03:35:32.610609 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:32 crc kubenswrapper[4819]: I0228 03:35:32.612133 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:32 crc kubenswrapper[4819]: I0228 03:35:32.612219 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:32 crc kubenswrapper[4819]: I0228 03:35:32.612280 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:32 crc kubenswrapper[4819]: E0228 03:35:32.968190 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 28 03:35:32 crc kubenswrapper[4819]: I0228 03:35:32.975183 4819 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:32 crc kubenswrapper[4819]: I0228 03:35:32.976858 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:32 crc kubenswrapper[4819]: I0228 03:35:32.976919 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:32 crc kubenswrapper[4819]: I0228 03:35:32.976939 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:32 crc kubenswrapper[4819]: I0228 03:35:32.976986 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 03:35:32 crc kubenswrapper[4819]: E0228 03:35:32.983237 4819 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 28 03:35:33 crc kubenswrapper[4819]: I0228 03:35:33.290559 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:33 crc kubenswrapper[4819]: I0228 03:35:33.955172 4819 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 28 03:35:33 crc kubenswrapper[4819]: I0228 03:35:33.970468 4819 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 28 03:35:34 crc kubenswrapper[4819]: I0228 03:35:34.289281 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:35 
crc kubenswrapper[4819]: I0228 03:35:35.293276 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:36 crc kubenswrapper[4819]: I0228 03:35:36.292755 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:37 crc kubenswrapper[4819]: I0228 03:35:37.293579 4819 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 28 03:35:37 crc kubenswrapper[4819]: I0228 03:35:37.427670 4819 csr.go:261] certificate signing request csr-7dxd4 is approved, waiting to be issued Feb 28 03:35:37 crc kubenswrapper[4819]: I0228 03:35:37.438127 4819 csr.go:257] certificate signing request csr-7dxd4 is issued Feb 28 03:35:37 crc kubenswrapper[4819]: I0228 03:35:37.513915 4819 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 28 03:35:38 crc kubenswrapper[4819]: I0228 03:35:38.115461 4819 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 28 03:35:38 crc kubenswrapper[4819]: I0228 03:35:38.439739 4819 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-29 08:46:48.707177835 +0000 UTC Feb 28 03:35:38 crc kubenswrapper[4819]: I0228 03:35:38.439787 4819 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7301h11m10.267395059s for next certificate rotation Feb 28 03:35:39 crc kubenswrapper[4819]: I0228 
03:35:39.368983 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:39 crc kubenswrapper[4819]: I0228 03:35:39.370890 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:39 crc kubenswrapper[4819]: I0228 03:35:39.370954 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:39 crc kubenswrapper[4819]: I0228 03:35:39.370972 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:39 crc kubenswrapper[4819]: I0228 03:35:39.371954 4819 scope.go:117] "RemoveContainer" containerID="0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6" Feb 28 03:35:39 crc kubenswrapper[4819]: E0228 03:35:39.372239 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:35:39 crc kubenswrapper[4819]: I0228 03:35:39.984169 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:39 crc kubenswrapper[4819]: I0228 03:35:39.985882 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:39 crc kubenswrapper[4819]: I0228 03:35:39.985932 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:39 crc kubenswrapper[4819]: I0228 03:35:39.985990 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:39 crc 
kubenswrapper[4819]: I0228 03:35:39.986168 4819 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 28 03:35:39 crc kubenswrapper[4819]: I0228 03:35:39.995648 4819 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 28 03:35:39 crc kubenswrapper[4819]: I0228 03:35:39.996017 4819 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 28 03:35:39 crc kubenswrapper[4819]: E0228 03:35:39.996052 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.000749 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.000786 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.000803 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.000825 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.000842 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:35:40Z","lastTransitionTime":"2026-02-28T03:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.023133 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.033843 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.033909 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.033933 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.033992 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.034025 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:35:40Z","lastTransitionTime":"2026-02-28T03:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.049094 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.059923 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.059989 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.060013 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.060046 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.060068 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:35:40Z","lastTransitionTime":"2026-02-28T03:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.070833 4819 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.077856 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.082454 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.082506 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.082526 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.082550 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.082566 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:35:40Z","lastTransitionTime":"2026-02-28T03:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.099058 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.099376 4819 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.099423 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.199817 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.300382 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.400950 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.501148 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.601465 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.702397 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:40 crc kubenswrapper[4819]: I0228 03:35:40.739745 4819 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.803190 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:40 crc kubenswrapper[4819]: E0228 03:35:40.904049 4819 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:41 crc kubenswrapper[4819]: E0228 03:35:41.004682 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:41 crc kubenswrapper[4819]: E0228 03:35:41.105444 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:41 crc kubenswrapper[4819]: E0228 03:35:41.206468 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:41 crc kubenswrapper[4819]: E0228 03:35:41.307145 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:41 crc kubenswrapper[4819]: E0228 03:35:41.407729 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:41 crc kubenswrapper[4819]: E0228 03:35:41.508941 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:41 crc kubenswrapper[4819]: E0228 03:35:41.609853 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:41 crc kubenswrapper[4819]: E0228 03:35:41.710815 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:41 crc kubenswrapper[4819]: E0228 03:35:41.811431 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:41 crc kubenswrapper[4819]: E0228 03:35:41.911547 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:42 crc kubenswrapper[4819]: E0228 03:35:42.012477 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:42 crc 
kubenswrapper[4819]: E0228 03:35:42.112999 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:42 crc kubenswrapper[4819]: E0228 03:35:42.214420 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:42 crc kubenswrapper[4819]: E0228 03:35:42.315003 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:42 crc kubenswrapper[4819]: E0228 03:35:42.415736 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:42 crc kubenswrapper[4819]: E0228 03:35:42.490309 4819 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 03:35:42 crc kubenswrapper[4819]: E0228 03:35:42.516343 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:42 crc kubenswrapper[4819]: E0228 03:35:42.616616 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:42 crc kubenswrapper[4819]: E0228 03:35:42.717350 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:42 crc kubenswrapper[4819]: E0228 03:35:42.817910 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:42 crc kubenswrapper[4819]: E0228 03:35:42.918320 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:43 crc kubenswrapper[4819]: E0228 03:35:43.019370 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:43 crc kubenswrapper[4819]: E0228 03:35:43.120366 4819 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 28 03:35:43 crc kubenswrapper[4819]: E0228 03:35:43.221437 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:43 crc kubenswrapper[4819]: E0228 03:35:43.321540 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:43 crc kubenswrapper[4819]: E0228 03:35:43.421756 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:43 crc kubenswrapper[4819]: E0228 03:35:43.522964 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:43 crc kubenswrapper[4819]: E0228 03:35:43.623860 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:43 crc kubenswrapper[4819]: E0228 03:35:43.724239 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:43 crc kubenswrapper[4819]: E0228 03:35:43.825289 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:43 crc kubenswrapper[4819]: E0228 03:35:43.926309 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:44 crc kubenswrapper[4819]: E0228 03:35:44.027344 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:44 crc kubenswrapper[4819]: E0228 03:35:44.128488 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:44 crc kubenswrapper[4819]: E0228 03:35:44.229336 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:44 crc kubenswrapper[4819]: E0228 03:35:44.329966 4819 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:44 crc kubenswrapper[4819]: E0228 03:35:44.430565 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:44 crc kubenswrapper[4819]: E0228 03:35:44.531358 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:44 crc kubenswrapper[4819]: E0228 03:35:44.632659 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:44 crc kubenswrapper[4819]: E0228 03:35:44.733707 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:44 crc kubenswrapper[4819]: E0228 03:35:44.834497 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:44 crc kubenswrapper[4819]: E0228 03:35:44.935584 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:45 crc kubenswrapper[4819]: E0228 03:35:45.035861 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:45 crc kubenswrapper[4819]: E0228 03:35:45.136800 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:45 crc kubenswrapper[4819]: E0228 03:35:45.237115 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:45 crc kubenswrapper[4819]: E0228 03:35:45.337836 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:45 crc kubenswrapper[4819]: E0228 03:35:45.438456 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:45 crc 
kubenswrapper[4819]: E0228 03:35:45.538959 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:45 crc kubenswrapper[4819]: E0228 03:35:45.639301 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:45 crc kubenswrapper[4819]: E0228 03:35:45.740107 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:45 crc kubenswrapper[4819]: E0228 03:35:45.840321 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:45 crc kubenswrapper[4819]: E0228 03:35:45.941527 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:46 crc kubenswrapper[4819]: E0228 03:35:46.041706 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:46 crc kubenswrapper[4819]: E0228 03:35:46.141846 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:46 crc kubenswrapper[4819]: E0228 03:35:46.242807 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:46 crc kubenswrapper[4819]: E0228 03:35:46.343310 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:46 crc kubenswrapper[4819]: E0228 03:35:46.443766 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:46 crc kubenswrapper[4819]: E0228 03:35:46.544723 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:46 crc kubenswrapper[4819]: E0228 03:35:46.645766 4819 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 28 03:35:46 crc kubenswrapper[4819]: E0228 03:35:46.746442 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:46 crc kubenswrapper[4819]: E0228 03:35:46.846850 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:46 crc kubenswrapper[4819]: E0228 03:35:46.947448 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:47 crc kubenswrapper[4819]: E0228 03:35:47.048451 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:47 crc kubenswrapper[4819]: E0228 03:35:47.149401 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:47 crc kubenswrapper[4819]: E0228 03:35:47.250490 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:47 crc kubenswrapper[4819]: E0228 03:35:47.351110 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:47 crc kubenswrapper[4819]: I0228 03:35:47.368722 4819 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:47 crc kubenswrapper[4819]: I0228 03:35:47.369976 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:47 crc kubenswrapper[4819]: I0228 03:35:47.370019 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:47 crc kubenswrapper[4819]: I0228 03:35:47.370033 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:47 crc kubenswrapper[4819]: E0228 03:35:47.452139 4819 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:47 crc kubenswrapper[4819]: E0228 03:35:47.553202 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:47 crc kubenswrapper[4819]: E0228 03:35:47.653776 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:47 crc kubenswrapper[4819]: E0228 03:35:47.754019 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:47 crc kubenswrapper[4819]: E0228 03:35:47.854993 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:47 crc kubenswrapper[4819]: E0228 03:35:47.956414 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:48 crc kubenswrapper[4819]: E0228 03:35:48.057899 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:48 crc kubenswrapper[4819]: E0228 03:35:48.159120 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:48 crc kubenswrapper[4819]: E0228 03:35:48.259778 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:48 crc kubenswrapper[4819]: E0228 03:35:48.359930 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:48 crc kubenswrapper[4819]: E0228 03:35:48.460672 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:48 crc kubenswrapper[4819]: E0228 03:35:48.561412 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:48 crc kubenswrapper[4819]: E0228 
03:35:48.661756 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:48 crc kubenswrapper[4819]: E0228 03:35:48.762930 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:48 crc kubenswrapper[4819]: E0228 03:35:48.864053 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:48 crc kubenswrapper[4819]: E0228 03:35:48.964308 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:49 crc kubenswrapper[4819]: E0228 03:35:49.065290 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:49 crc kubenswrapper[4819]: E0228 03:35:49.166435 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:49 crc kubenswrapper[4819]: E0228 03:35:49.267552 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:49 crc kubenswrapper[4819]: E0228 03:35:49.367707 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:49 crc kubenswrapper[4819]: E0228 03:35:49.469164 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:49 crc kubenswrapper[4819]: E0228 03:35:49.569592 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:49 crc kubenswrapper[4819]: E0228 03:35:49.670648 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:49 crc kubenswrapper[4819]: E0228 03:35:49.771116 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 
03:35:49 crc kubenswrapper[4819]: E0228 03:35:49.871448 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:49 crc kubenswrapper[4819]: E0228 03:35:49.971857 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.072934 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.173730 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.274307 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.375470 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.463578 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.468278 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.468327 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.468345 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.468370 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.468388 4819 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:35:50Z","lastTransitionTime":"2026-02-28T03:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.485057 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.490428 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.490477 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.490497 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.490520 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.490538 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:35:50Z","lastTransitionTime":"2026-02-28T03:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.506133 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.510106 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.510193 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.510221 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.510291 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.510315 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:35:50Z","lastTransitionTime":"2026-02-28T03:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.525687 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.530378 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.530440 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.530453 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.530471 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:35:50 crc kubenswrapper[4819]: I0228 03:35:50.530484 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:35:50Z","lastTransitionTime":"2026-02-28T03:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.544281 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.544420 4819 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.544448 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.645504 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.746657 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.847076 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:50 crc kubenswrapper[4819]: E0228 03:35:50.947968 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:51 crc kubenswrapper[4819]: E0228 03:35:51.048087 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:51 crc kubenswrapper[4819]: E0228 03:35:51.148209 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:51 crc kubenswrapper[4819]: E0228 03:35:51.249213 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:51 crc kubenswrapper[4819]: E0228 03:35:51.349707 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:51 crc kubenswrapper[4819]: I0228 03:35:51.368158 4819 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 28 03:35:51 crc kubenswrapper[4819]: I0228 03:35:51.369608 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:51 crc kubenswrapper[4819]: I0228 03:35:51.369660 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:51 crc kubenswrapper[4819]: I0228 03:35:51.369679 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:51 crc kubenswrapper[4819]: I0228 03:35:51.370611 4819 scope.go:117] "RemoveContainer" containerID="0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6" Feb 28 03:35:51 crc kubenswrapper[4819]: E0228 03:35:51.370882 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:35:51 crc kubenswrapper[4819]: E0228 03:35:51.450863 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:51 crc kubenswrapper[4819]: E0228 03:35:51.551820 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:51 crc kubenswrapper[4819]: E0228 03:35:51.652377 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:51 crc kubenswrapper[4819]: E0228 03:35:51.753618 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:51 crc kubenswrapper[4819]: E0228 
03:35:51.854884 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:51 crc kubenswrapper[4819]: E0228 03:35:51.955892 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:52 crc kubenswrapper[4819]: E0228 03:35:52.057103 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:52 crc kubenswrapper[4819]: E0228 03:35:52.158113 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:52 crc kubenswrapper[4819]: E0228 03:35:52.259072 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:52 crc kubenswrapper[4819]: E0228 03:35:52.359791 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:52 crc kubenswrapper[4819]: E0228 03:35:52.460530 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:52 crc kubenswrapper[4819]: E0228 03:35:52.490690 4819 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 28 03:35:52 crc kubenswrapper[4819]: E0228 03:35:52.561106 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:52 crc kubenswrapper[4819]: E0228 03:35:52.661873 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:52 crc kubenswrapper[4819]: E0228 03:35:52.762639 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:52 crc kubenswrapper[4819]: E0228 03:35:52.863021 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Feb 28 03:35:52 crc kubenswrapper[4819]: E0228 03:35:52.963824 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:53 crc kubenswrapper[4819]: E0228 03:35:53.064489 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:53 crc kubenswrapper[4819]: E0228 03:35:53.165211 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:53 crc kubenswrapper[4819]: E0228 03:35:53.266094 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:53 crc kubenswrapper[4819]: E0228 03:35:53.367177 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:53 crc kubenswrapper[4819]: E0228 03:35:53.467659 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:53 crc kubenswrapper[4819]: E0228 03:35:53.568791 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:53 crc kubenswrapper[4819]: E0228 03:35:53.670078 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:53 crc kubenswrapper[4819]: E0228 03:35:53.771076 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:53 crc kubenswrapper[4819]: E0228 03:35:53.871813 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:53 crc kubenswrapper[4819]: E0228 03:35:53.973839 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:54 crc kubenswrapper[4819]: E0228 03:35:54.074841 4819 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Feb 28 03:35:54 crc kubenswrapper[4819]: E0228 03:35:54.175483 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:54 crc kubenswrapper[4819]: E0228 03:35:54.276504 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:54 crc kubenswrapper[4819]: E0228 03:35:54.377421 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:54 crc kubenswrapper[4819]: E0228 03:35:54.477620 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:54 crc kubenswrapper[4819]: E0228 03:35:54.577961 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:54 crc kubenswrapper[4819]: E0228 03:35:54.678345 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:54 crc kubenswrapper[4819]: E0228 03:35:54.778918 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:54 crc kubenswrapper[4819]: E0228 03:35:54.879313 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:54 crc kubenswrapper[4819]: E0228 03:35:54.979704 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:55 crc kubenswrapper[4819]: E0228 03:35:55.080376 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:55 crc kubenswrapper[4819]: E0228 03:35:55.180748 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:55 crc kubenswrapper[4819]: E0228 03:35:55.281869 4819 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:55 crc kubenswrapper[4819]: E0228 03:35:55.382923 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:55 crc kubenswrapper[4819]: E0228 03:35:55.483073 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:55 crc kubenswrapper[4819]: E0228 03:35:55.583274 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:55 crc kubenswrapper[4819]: E0228 03:35:55.683491 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:55 crc kubenswrapper[4819]: E0228 03:35:55.784209 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:55 crc kubenswrapper[4819]: E0228 03:35:55.885282 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:55 crc kubenswrapper[4819]: E0228 03:35:55.985941 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:56 crc kubenswrapper[4819]: E0228 03:35:56.086962 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:56 crc kubenswrapper[4819]: E0228 03:35:56.187491 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:56 crc kubenswrapper[4819]: E0228 03:35:56.287993 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:56 crc kubenswrapper[4819]: E0228 03:35:56.388439 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:56 crc 
kubenswrapper[4819]: E0228 03:35:56.489607 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:56 crc kubenswrapper[4819]: E0228 03:35:56.589760 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:56 crc kubenswrapper[4819]: E0228 03:35:56.690145 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:56 crc kubenswrapper[4819]: E0228 03:35:56.790285 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:56 crc kubenswrapper[4819]: E0228 03:35:56.891034 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:56 crc kubenswrapper[4819]: E0228 03:35:56.991910 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:57 crc kubenswrapper[4819]: E0228 03:35:57.092321 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:57 crc kubenswrapper[4819]: E0228 03:35:57.193408 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:57 crc kubenswrapper[4819]: E0228 03:35:57.294347 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:57 crc kubenswrapper[4819]: E0228 03:35:57.394454 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:57 crc kubenswrapper[4819]: E0228 03:35:57.495451 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:57 crc kubenswrapper[4819]: E0228 03:35:57.595554 4819 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 28 03:35:57 crc kubenswrapper[4819]: E0228 03:35:57.696067 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:57 crc kubenswrapper[4819]: E0228 03:35:57.796307 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:57 crc kubenswrapper[4819]: E0228 03:35:57.897278 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:57 crc kubenswrapper[4819]: E0228 03:35:57.998344 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:58 crc kubenswrapper[4819]: E0228 03:35:58.099370 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:58 crc kubenswrapper[4819]: E0228 03:35:58.199915 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:58 crc kubenswrapper[4819]: E0228 03:35:58.300904 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:58 crc kubenswrapper[4819]: E0228 03:35:58.400988 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:58 crc kubenswrapper[4819]: E0228 03:35:58.501527 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:58 crc kubenswrapper[4819]: E0228 03:35:58.602702 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:58 crc kubenswrapper[4819]: E0228 03:35:58.703291 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:58 crc kubenswrapper[4819]: E0228 03:35:58.803788 4819 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:58 crc kubenswrapper[4819]: E0228 03:35:58.904583 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:59 crc kubenswrapper[4819]: E0228 03:35:59.005028 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 03:35:59.066606 4819 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 28 03:35:59 crc kubenswrapper[4819]: E0228 03:35:59.105947 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:59 crc kubenswrapper[4819]: E0228 03:35:59.207097 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:59 crc kubenswrapper[4819]: E0228 03:35:59.307532 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:59 crc kubenswrapper[4819]: E0228 03:35:59.407839 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:59 crc kubenswrapper[4819]: E0228 03:35:59.508146 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:59 crc kubenswrapper[4819]: E0228 03:35:59.608327 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:59 crc kubenswrapper[4819]: E0228 03:35:59.708693 4819 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 03:35:59.721945 4819 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 
03:35:59.811648 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 03:35:59.811711 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 03:35:59.811731 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 03:35:59.811756 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 03:35:59.811776 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:35:59Z","lastTransitionTime":"2026-02-28T03:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 03:35:59.914672 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 03:35:59.914707 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 03:35:59.914717 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 03:35:59.914732 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:35:59 crc kubenswrapper[4819]: I0228 03:35:59.914743 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:35:59Z","lastTransitionTime":"2026-02-28T03:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.018451 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.018497 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.018509 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.018527 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.018539 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.125563 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.125650 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.125674 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.125698 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.125716 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.228674 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.228735 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.228753 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.228780 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.228795 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.332006 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.332058 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.332075 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.332099 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.332116 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.435565 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.435633 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.435651 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.435682 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.435708 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.439847 4819 apiserver.go:52] "Watching apiserver"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.447082 4819 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.447641 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-krp5h","openshift-machine-config-operator/machine-config-daemon-rw4hn","openshift-multus/multus-5ldpg","openshift-multus/multus-additional-cni-plugins-b8c5l","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-node-njv8f"]
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.448151 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.448482 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.448937 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-krp5h"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.448981 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.448387 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.449049 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.448660 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.448557 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.448512 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.449511 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.449572 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.449638 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.450589 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.450705 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.452597 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.452916 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.452950 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.453047 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.453167 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.453580 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.456952 4819 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.456995 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.457736 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.457994 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.458220 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.458716 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.459286 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.459879 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.461334 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.461404 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.461696 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.461712 4819 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.463665 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.466029 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.466170 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.467041 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.467507 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.467540 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.467818 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.467966 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.468030 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.468375 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.468681 4819 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.468885 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.469669 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.481306 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.491300 4819 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.498714 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.525582 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533216 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533326 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533375 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533419 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533460 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533505 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533548 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533592 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533634 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533676 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533721 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 28 
03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533761 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533803 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533850 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533890 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533935 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.533982 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534028 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534068 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534107 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534240 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534320 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534367 4819 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534416 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534462 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534506 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534550 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534592 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534634 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534680 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534725 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534771 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534814 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534855 4819 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534898 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534945 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.534988 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535030 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535076 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535119 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535164 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535210 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535285 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535331 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535376 4819 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535423 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535469 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535511 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535554 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535604 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535647 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535691 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535748 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535790 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535838 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535882 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535931 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.535981 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536024 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536068 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536112 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536159 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536201 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536243 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536269 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536321 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536370 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536418 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536464 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536538 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536588 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536592 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536691 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536738 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536774 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536807 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536838 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536869 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536899 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536930 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536965 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.536999 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537030 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537067 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537098 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537128 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537160 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 03:36:00 crc 
kubenswrapper[4819]: I0228 03:36:00.537193 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537221 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537272 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537306 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537335 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537367 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537398 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537427 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537458 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537490 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537519 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 03:36:00 crc 
kubenswrapper[4819]: I0228 03:36:00.537550 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537581 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537613 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537645 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537673 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537702 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537734 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537763 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537793 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537822 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537851 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 
03:36:00.537882 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537914 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537914 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537945 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.537979 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538009 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538037 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538066 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538097 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538127 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538156 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538185 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538218 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538273 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538309 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538340 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538370 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538400 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538430 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538457 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538490 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538520 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538552 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538584 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538616 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 
28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538648 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538663 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538680 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538715 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538747 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538779 4819 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538810 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538840 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538873 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538908 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538948 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.538977 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539013 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539044 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539074 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539103 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539134 4819 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539164 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539195 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539227 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539426 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539462 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539495 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539525 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539557 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539588 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539623 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539655 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539691 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539724 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539754 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539784 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539815 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539848 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539881 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539915 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539945 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.539979 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540013 4819 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540115 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540148 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540189 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540221 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540271 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540309 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540339 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540371 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540402 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540432 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540475 4819 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540529 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540561 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540593 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540626 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540657 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540688 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540720 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540755 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540785 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540816 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540849 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540879 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540913 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540944 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.540976 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541059 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-cni-binary-copy\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " 
pod="openshift-multus/multus-additional-cni-plugins-b8c5l" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541102 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541134 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541164 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caffcb28-383d-4424-a641-7dd1f36080c8-ovn-node-metrics-cert\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541192 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541225 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-kubelet\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541271 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-run-netns\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541300 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-hostroot\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541327 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-log-socket\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541354 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-config\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541386 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbjp4\" (UniqueName: 
\"kubernetes.io/projected/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-kube-api-access-pbjp4\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541419 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78f6484e-91d1-4345-baad-9f39f49a3915-cni-binary-copy\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541454 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541482 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-node-log\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541510 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgl7f\" (UniqueName: \"kubernetes.io/projected/ebdb39d5-8593-4a70-a0cd-c4701f9e58da-kube-api-access-fgl7f\") pod \"node-resolver-krp5h\" (UID: \"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\") " pod="openshift-dns/node-resolver-krp5h" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541551 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-run-k8s-cni-cncf-io\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541588 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541620 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-openvswitch\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541649 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-systemd\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541679 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-etc-openvswitch\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541706 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-ovn-kubernetes\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541741 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541771 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-system-cni-dir\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541800 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-var-lib-cni-multus\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541874 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-var-lib-kubelet\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541905 4819 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-rootfs\") pod \"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541936 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78f6484e-91d1-4345-baad-9f39f49a3915-multus-daemon-config\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541967 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-netns\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.541995 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-bin\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542029 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 03:36:00 crc kubenswrapper[4819]: 
I0228 03:36:00.542062 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-system-cni-dir\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542093 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-cnibin\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542122 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-multus-conf-dir\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542154 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542183 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-multus-socket-dir-parent\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " 
pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542212 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-etc-kubernetes\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542267 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542302 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-var-lib-openvswitch\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542336 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542371 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542400 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-multus-cni-dir\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542430 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-os-release\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542460 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9lwq\" (UniqueName: \"kubernetes.io/projected/caffcb28-383d-4424-a641-7dd1f36080c8-kube-api-access-h9lwq\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542491 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542522 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88mgf\" (UniqueName: \"kubernetes.io/projected/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-kube-api-access-88mgf\") pod 
\"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542550 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-slash\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542586 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542619 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-os-release\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542651 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-run-multus-certs\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542682 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v76t9\" (UniqueName: 
\"kubernetes.io/projected/78f6484e-91d1-4345-baad-9f39f49a3915-kube-api-access-v76t9\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542708 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-var-lib-cni-bin\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542745 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542780 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542808 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-systemd-units\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542837 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-netd\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542867 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-mcd-auth-proxy-config\") pod \"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542896 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ebdb39d5-8593-4a70-a0cd-c4701f9e58da-hosts-file\") pod \"node-resolver-krp5h\" (UID: \"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\") " pod="openshift-dns/node-resolver-krp5h" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542901 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542923 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-env-overrides\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542950 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-script-lib\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.542981 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-proxy-tls\") pod \"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.543010 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-cnibin\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.543037 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-ovn\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.543065 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.543102 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.543196 4819 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.543218 4819 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.543239 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.543376 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod 
"6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.543542 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.543703 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.543889 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.544126 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.544426 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.544827 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.545401 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.545746 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.545791 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.546070 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.546105 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.546273 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.546358 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.546579 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.546826 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.546930 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.547084 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.547083 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.547341 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.547418 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.547683 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.547986 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.548198 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.548272 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.548349 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.548447 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.550316 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.550545 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.554553 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.554782 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.555387 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.555430 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.555631 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.555718 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.556036 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.556144 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.556142 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.556511 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.556514 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.561015 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.561344 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.561533 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.561692 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.556656 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.563241 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.563477 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.564394 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.564847 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.565186 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.565407 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.565792 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.566200 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.566274 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.566547 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.566679 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.566759 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.567154 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.567242 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.567623 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.568134 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.569113 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.569489 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.569820 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.570282 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.570348 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.570690 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.571661 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.564804 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.572393 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.572703 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.573055 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.573164 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.573156 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.573601 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.573706 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.573776 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.574294 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.574319 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.574348 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.574676 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.574847 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.569432 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.544711 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.575947 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.578605 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.578814 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.579120 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.579340 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.579586 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.579542 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.579848 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.580450 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.580845 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.580914 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.582598 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.583526 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.583652 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.583727 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.575957 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.583876 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.583925 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.583956 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584078 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.584103 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.584212 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.584332 4819 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584287 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584474 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584553 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584635 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584718 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584744 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584753 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584769 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584780 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584940 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584987 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.585379 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.585633 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.585751 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.584123 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.581893 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.585880 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.586203 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.586139 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.586319 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.586461 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.586511 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.586523 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.586702 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.587009 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.587185 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.587213 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.587262 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.587505 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.585797 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.587771 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.588077 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.588225 4819 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.588415 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.588615 4819 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.588710 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.588757 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.588826 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.588952 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.589280 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.589395 4819 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.589492 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.589608 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.589779 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.589827 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.589848 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.589862 4819 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.590179 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.590807 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.590962 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.590864 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.591353 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.591482 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.591558 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.591857 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.591991 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.592276 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.592302 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.592448 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.592676 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.592709 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.592846 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.592939 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.593070 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.593448 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:01.093424785 +0000 UTC m=+99.558993643 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.593548 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-28 03:36:01.093538977 +0000 UTC m=+99.559107835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.593635 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:01.093625779 +0000 UTC m=+99.559194647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.593787 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.594020 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.594277 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.594360 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.594648 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.595599 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.596336 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:36:01.09631635 +0000 UTC m=+99.561885208 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.596614 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.596643 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.596943 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.597098 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.597186 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.597399 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.597838 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.598049 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.598103 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.599068 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.599161 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.599479 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.599461 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.599705 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.599853 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.599942 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.600240 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.600518 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.600611 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.600703 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.601395 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.603302 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.603463 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.603864 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.604038 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.604661 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.604711 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.609287 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.609377 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.609389 4819 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.609514 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.609487 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:01.109471678 +0000 UTC m=+99.575040536 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.610549 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.610767 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.612608 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.612756 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.612950 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.613383 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 
03:36:00.613454 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.615602 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.615913 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.616051 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.623229 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.623388 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.623628 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.628718 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.630076 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.630109 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.630122 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.630140 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.630153 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.635444 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.639538 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.640051 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.642591 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.642637 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.642698 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.642717 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.642728 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.645140 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.647175 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.651527 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.653911 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.653993 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.654149 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.654312 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.654464 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.658236 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.662656 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: E0228 03:36:00.662978 4819 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.664714 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.664766 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.664785 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.664815 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.664833 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.667036 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.680334 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.688726 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-run-multus-certs\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.688778 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v76t9\" (UniqueName: \"kubernetes.io/projected/78f6484e-91d1-4345-baad-9f39f49a3915-kube-api-access-v76t9\") 
pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.688814 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-os-release\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.688845 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-netd\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.688868 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-run-multus-certs\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.688878 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-mcd-auth-proxy-config\") pod \"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.688945 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ebdb39d5-8593-4a70-a0cd-c4701f9e58da-hosts-file\") pod \"node-resolver-krp5h\" (UID: 
\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\") " pod="openshift-dns/node-resolver-krp5h" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.688967 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-var-lib-cni-bin\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689004 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-systemd-units\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689023 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-cnibin\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689040 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-env-overrides\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689057 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-script-lib\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc 
kubenswrapper[4819]: I0228 03:36:00.689075 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-proxy-tls\") pod \"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689101 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-ovn\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689120 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689147 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-cni-binary-copy\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689175 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-kubelet\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc 
kubenswrapper[4819]: I0228 03:36:00.689194 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689211 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caffcb28-383d-4424-a641-7dd1f36080c8-ovn-node-metrics-cert\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689229 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689224 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-os-release\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689266 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbjp4\" (UniqueName: \"kubernetes.io/projected/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-kube-api-access-pbjp4\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " 
pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689338 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78f6484e-91d1-4345-baad-9f39f49a3915-cni-binary-copy\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689360 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-run-netns\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689381 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-hostroot\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689399 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-log-socket\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689421 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-config\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689440 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgl7f\" (UniqueName: \"kubernetes.io/projected/ebdb39d5-8593-4a70-a0cd-c4701f9e58da-kube-api-access-fgl7f\") pod \"node-resolver-krp5h\" (UID: \"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\") " pod="openshift-dns/node-resolver-krp5h"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689465 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-run-k8s-cni-cncf-io\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689483 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-node-log\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689502 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-openvswitch\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689506 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-ovn\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689541 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-systemd\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689564 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-etc-openvswitch\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689581 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-ovn-kubernetes\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689601 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-rootfs\") pod \"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689620 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-system-cni-dir\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689640 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-var-lib-cni-multus\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689656 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-var-lib-kubelet\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689673 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-system-cni-dir\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689688 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-cnibin\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689705 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-multus-conf-dir\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689719 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78f6484e-91d1-4345-baad-9f39f49a3915-multus-daemon-config\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689743 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-netns\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689765 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-bin\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689789 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689818 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-multus-cni-dir\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689836 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-os-release\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689852 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-multus-socket-dir-parent\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689870 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-etc-kubernetes\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689887 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-var-lib-openvswitch\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689905 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-slash\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689961 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9lwq\" (UniqueName: \"kubernetes.io/projected/caffcb28-383d-4424-a641-7dd1f36080c8-kube-api-access-h9lwq\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689958 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-mcd-auth-proxy-config\") pod \"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.689981 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690000 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88mgf\" (UniqueName: \"kubernetes.io/projected/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-kube-api-access-88mgf\") pod \"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690041 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-netd\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690149 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-log-socket\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690141 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-hostroot\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690187 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690204 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-var-lib-kubelet\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690228 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-system-cni-dir\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690267 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-cnibin\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690292 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-multus-conf-dir\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690642 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-systemd-units\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690642 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690722 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-kubelet\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690753 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-cnibin\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690762 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ebdb39d5-8593-4a70-a0cd-c4701f9e58da-hosts-file\") pod \"node-resolver-krp5h\" (UID: \"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\") " pod="openshift-dns/node-resolver-krp5h"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690771 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-var-lib-cni-bin\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.690840 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-run-k8s-cni-cncf-io\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691285 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-slash\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691378 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-netns\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691447 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-bin\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691478 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-node-log\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691500 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-systemd\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691626 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-etc-openvswitch\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691637 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-cni-binary-copy\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691758 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-run-netns\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691765 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-system-cni-dir\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691778 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691821 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-ovn-kubernetes\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691822 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-openvswitch\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691842 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78f6484e-91d1-4345-baad-9f39f49a3915-multus-daemon-config\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691875 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-rootfs\") pod \"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691883 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-multus-cni-dir\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691918 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-os-release\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691920 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-multus-socket-dir-parent\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691933 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78f6484e-91d1-4345-baad-9f39f49a3915-cni-binary-copy\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691959 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691975 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-var-lib-openvswitch\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.691978 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-host-var-lib-cni-multus\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692030 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-config\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692215 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f6484e-91d1-4345-baad-9f39f49a3915-etc-kubernetes\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692798 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692836 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692857 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692877 4819 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692895 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692914 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692932 4819 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692949 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692968 4819 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.692986 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693004 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693023 4819 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693042 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693061 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693079 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693157 4819 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693187 4819 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693206 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693221 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693272 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693309 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693352 4819 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693819 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-env-overrides\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.693833 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-script-lib\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.694830 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.694877 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.694888 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.694899 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.694908 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.694938 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.694948 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.694958 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.694967 4819 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.694978 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.694990 4819 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695025 4819 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695035 4819 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695044 4819 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695053 4819 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695062 4819 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695072 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695079 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caffcb28-383d-4424-a641-7dd1f36080c8-ovn-node-metrics-cert\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695102 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695113 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695122 4819 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695132 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695141 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695150 4819 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695177 4819 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695190 4819 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695200 4819
reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695208 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695217 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695226 4819 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695235 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695273 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695325 4819 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695357 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695367 4819 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695376 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695386 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695394 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695403 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695412 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695421 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695430 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695438 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695447 4819 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695455 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695463 4819 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695471 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695481 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" 
DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695490 4819 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695498 4819 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695509 4819 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695518 4819 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695526 4819 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695535 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695543 4819 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695552 4819 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695560 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695568 4819 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695579 4819 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695588 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695596 4819 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695604 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695613 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 
28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695621 4819 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695631 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695642 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695650 4819 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695658 4819 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695667 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695675 4819 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695683 4819 
reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695691 4819 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695699 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695708 4819 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695716 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695724 4819 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695733 4819 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695741 4819 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node 
\"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695750 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695760 4819 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695769 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695779 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695788 4819 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695798 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695808 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 28 
03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695816 4819 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695825 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695834 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695843 4819 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695852 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695861 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695869 4819 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695878 4819 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695886 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695895 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695905 4819 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695915 4819 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695924 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695944 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695956 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695966 4819 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695975 4819 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695984 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.695993 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696002 4819 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696012 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696021 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 
03:36:00.696029 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696039 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696047 4819 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696055 4819 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696064 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696075 4819 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696084 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696092 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696101 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696112 4819 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696121 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696130 4819 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696139 4819 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696149 4819 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696158 4819 reconciler_common.go:293] "Volume detached for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696165 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696174 4819 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696182 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696190 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696198 4819 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696208 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696217 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" 
DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696226 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696235 4819 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696258 4819 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696267 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696276 4819 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696284 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696293 4819 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696301 4819 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696309 4819 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696318 4819 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696326 4819 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696335 4819 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696344 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696352 4819 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696361 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696370 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696378 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696387 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696395 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696405 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696414 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696423 4819 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696432 4819 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696440 4819 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696452 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696460 4819 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696469 4819 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696477 4819 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696485 4819 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696493 4819 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696502 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696511 4819 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696520 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696529 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696554 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696563 4819 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696571 4819 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696580 4819 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696589 4819 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.696597 4819 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.697004 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-proxy-tls\") pod \"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.706119 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9lwq\" (UniqueName: \"kubernetes.io/projected/caffcb28-383d-4424-a641-7dd1f36080c8-kube-api-access-h9lwq\") pod \"ovnkube-node-njv8f\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.708145 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mgf\" (UniqueName: \"kubernetes.io/projected/d6ad11c1-0eb7-4064-bb39-3ffb389efb90-kube-api-access-88mgf\") pod \"machine-config-daemon-rw4hn\" (UID: \"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\") " pod="openshift-machine-config-operator/machine-config-daemon-rw4hn"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.709236 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v76t9\" (UniqueName: \"kubernetes.io/projected/78f6484e-91d1-4345-baad-9f39f49a3915-kube-api-access-v76t9\") pod \"multus-5ldpg\" (UID: \"78f6484e-91d1-4345-baad-9f39f49a3915\") " pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.713377 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbjp4\" (UniqueName: \"kubernetes.io/projected/e47759ba-9f0e-4aba-b3cf-dc4142c02f41-kube-api-access-pbjp4\") pod \"multus-additional-cni-plugins-b8c5l\" (UID: \"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\") " pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.713790 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgl7f\" (UniqueName: \"kubernetes.io/projected/ebdb39d5-8593-4a70-a0cd-c4701f9e58da-kube-api-access-fgl7f\") pod \"node-resolver-krp5h\" (UID: \"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\") " pod="openshift-dns/node-resolver-krp5h"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.767625 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.767655 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.767663 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.767684 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.767696 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.776921 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.792281 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-krp5h"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.807829 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.828670 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-krp5h" event={"ID":"ebdb39d5-8593-4a70-a0cd-c4701f9e58da","Type":"ContainerStarted","Data":"d76e001a50e83297406e06ad4acb5c8e2f16faf786f3700fc3884c592079e2ea"}
Feb 28 03:36:00 crc kubenswrapper[4819]: W0228 03:36:00.831466 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-3fbbfb29483015a2fb1e9744bf716ad514d9a5b1ec26aaf288293c72161e1e1c WatchSource:0}: Error finding container 3fbbfb29483015a2fb1e9744bf716ad514d9a5b1ec26aaf288293c72161e1e1c: Status 404 returned error can't find the container with id 3fbbfb29483015a2fb1e9744bf716ad514d9a5b1ec26aaf288293c72161e1e1c
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.831593 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e8dd8a3cc6bd059f49cebb8cbc5f69a37877a84e2039f1b33de68a40969e5e7a"}
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.832977 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn"
Feb 28 03:36:00 crc kubenswrapper[4819]: W0228 03:36:00.857032 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6ad11c1_0eb7_4064_bb39_3ffb389efb90.slice/crio-d17c3e45904617ff2c0e0e49ebf6f9ee673b977049ac6d7e6d4b2423f93f0a15 WatchSource:0}: Error finding container d17c3e45904617ff2c0e0e49ebf6f9ee673b977049ac6d7e6d4b2423f93f0a15: Status 404 returned error can't find the container with id d17c3e45904617ff2c0e0e49ebf6f9ee673b977049ac6d7e6d4b2423f93f0a15
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.870320 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.870358 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.870373 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.870396 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.870413 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.895126 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 28 03:36:00 crc kubenswrapper[4819]: W0228 03:36:00.912793 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-c8aa50acad9d52db74ce0366ad529976628a2f743b58e286a3090b10b5e5369e WatchSource:0}: Error finding container c8aa50acad9d52db74ce0366ad529976628a2f743b58e286a3090b10b5e5369e: Status 404 returned error can't find the container with id c8aa50acad9d52db74ce0366ad529976628a2f743b58e286a3090b10b5e5369e
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.913989 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5ldpg"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.930457 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b8c5l"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.936448 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.972722 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.972754 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.972765 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.972782 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:00 crc kubenswrapper[4819]: I0228 03:36:00.972793 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:00Z","lastTransitionTime":"2026-02-28T03:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:00 crc kubenswrapper[4819]: W0228 03:36:00.986585 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode47759ba_9f0e_4aba_b3cf_dc4142c02f41.slice/crio-12ac8fc27e67af3cc352081fd77f71be5662022cb344f9d52b3c2279ef7d6543 WatchSource:0}: Error finding container 12ac8fc27e67af3cc352081fd77f71be5662022cb344f9d52b3c2279ef7d6543: Status 404 returned error can't find the container with id 12ac8fc27e67af3cc352081fd77f71be5662022cb344f9d52b3c2279ef7d6543
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.077641 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.077681 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.077693 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.077749 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.077763 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:01Z","lastTransitionTime":"2026-02-28T03:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.099971 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.100120 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:36:02.100087687 +0000 UTC m=+100.565656545 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.100202 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.100293 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.100434 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.100480 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.100510 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.100529 4819 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.100549 4819 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.100577 4819 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.100620 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:02.100594598 +0000 UTC m=+100.566163466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.100697 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:02.100637079 +0000 UTC m=+100.566205977 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.100732 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:02.100717681 +0000 UTC m=+100.566286569 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.182171 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.182224 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.182233 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.182262 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.182272 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:01Z","lastTransitionTime":"2026-02-28T03:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.201582 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.201746 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.201768 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.201781 4819 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 03:36:01 crc kubenswrapper[4819]: E0228 03:36:01.201832 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:02.201818466 +0000 UTC m=+100.667387324 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.285053 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.285093 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.285102 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.285117 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.285127 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:01Z","lastTransitionTime":"2026-02-28T03:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.388283 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.388329 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.388340 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.388357 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.388366 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:01Z","lastTransitionTime":"2026-02-28T03:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.490856 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.490922 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.490944 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.490974 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.490992 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:01Z","lastTransitionTime":"2026-02-28T03:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.594633 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.594707 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.594726 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.594753 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.594778 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:01Z","lastTransitionTime":"2026-02-28T03:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.698239 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.698300 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.698313 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.698334 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.698347 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:01Z","lastTransitionTime":"2026-02-28T03:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.801276 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.801338 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.801352 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.801369 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.801382 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:01Z","lastTransitionTime":"2026-02-28T03:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.836989 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.837056 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.837071 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3fbbfb29483015a2fb1e9744bf716ad514d9a5b1ec26aaf288293c72161e1e1c"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.838365 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-krp5h" event={"ID":"ebdb39d5-8593-4a70-a0cd-c4701f9e58da","Type":"ContainerStarted","Data":"fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.839730 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.841154 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc" exitCode=0 Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 
03:36:01.841224 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.841262 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"e3e2406e1911f02504bc0a19e1dcf9519fbe19f9ffdff9a439c32bd9b86f3f19"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.843988 4819 generic.go:334] "Generic (PLEG): container finished" podID="e47759ba-9f0e-4aba-b3cf-dc4142c02f41" containerID="e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e" exitCode=0 Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.844094 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" event={"ID":"e47759ba-9f0e-4aba-b3cf-dc4142c02f41","Type":"ContainerDied","Data":"e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.844137 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" event={"ID":"e47759ba-9f0e-4aba-b3cf-dc4142c02f41","Type":"ContainerStarted","Data":"12ac8fc27e67af3cc352081fd77f71be5662022cb344f9d52b3c2279ef7d6543"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.845864 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5ldpg" event={"ID":"78f6484e-91d1-4345-baad-9f39f49a3915","Type":"ContainerStarted","Data":"3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.845923 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5ldpg" 
event={"ID":"78f6484e-91d1-4345-baad-9f39f49a3915","Type":"ContainerStarted","Data":"120b0e652775d86a84d0d410c99603ec99710447fffe232fdcfb5a77cd4565b6"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.846966 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c8aa50acad9d52db74ce0366ad529976628a2f743b58e286a3090b10b5e5369e"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.848780 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerStarted","Data":"edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.848806 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerStarted","Data":"edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.848816 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerStarted","Data":"d17c3e45904617ff2c0e0e49ebf6f9ee673b977049ac6d7e6d4b2423f93f0a15"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.859670 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.869943 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.890329 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.904772 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.904821 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.904842 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.904863 4819 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.904879 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:01Z","lastTransitionTime":"2026-02-28T03:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.908979 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.938096 4819 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.955833 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.973906 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:01 crc kubenswrapper[4819]: I0228 03:36:01.988275 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.008817 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.008870 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.008881 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.008902 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.008915 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:02Z","lastTransitionTime":"2026-02-28T03:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.009825 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.025782 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.040867 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.059402 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.076041 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.095495 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.110764 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.110883 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.110950 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:02 crc 
kubenswrapper[4819]: I0228 03:36:02.111019 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.111184 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.111221 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.111241 4819 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.111339 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:04.111315412 +0000 UTC m=+102.576884310 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.111563 4819 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.111638 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:04.111618529 +0000 UTC m=+102.577187497 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.111720 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:36:04.111710981 +0000 UTC m=+102.577279839 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.111785 4819 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.111837 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:04.111821903 +0000 UTC m=+102.577390811 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.113173 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.116472 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.116499 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.116511 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.116528 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.116541 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:02Z","lastTransitionTime":"2026-02-28T03:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.128826 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.142936 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.162725 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.180957 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.195959 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.212449 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.212600 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.212621 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.212634 4819 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.212690 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:04.212674453 +0000 UTC m=+102.678243311 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.212892 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.218994 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.219026 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.219037 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:02 crc 
kubenswrapper[4819]: I0228 03:36:02.219056 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.219068 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:02Z","lastTransitionTime":"2026-02-28T03:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.227134 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.321786 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.321819 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.321830 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.321846 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.321858 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:02Z","lastTransitionTime":"2026-02-28T03:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.370204 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.370452 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.371501 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.371636 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.371742 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.371812 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.383693 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.387101 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.387873 4819 scope.go:117] "RemoveContainer" containerID="0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6" Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.388783 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.389871 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.390094 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.391942 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.394354 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.395648 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.397013 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.399423 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.400937 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.403108 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.404527 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.406486 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.407168 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.407929 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.409207 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.409943 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.411220 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.411812 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.412563 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" 
path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.413993 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.415199 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.415207 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.415952 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.417073 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.417973 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.419144 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 28 03:36:02 
crc kubenswrapper[4819]: I0228 03:36:02.419946 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.421301 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.422097 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.422920 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.424941 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.425596 4819 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.425736 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.426179 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:02 crc kubenswrapper[4819]: 
I0228 03:36:02.426271 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.426289 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.426315 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.426332 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:02Z","lastTransitionTime":"2026-02-28T03:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.428669 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.429495 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.430152 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.432835 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 28 03:36:02 crc 
kubenswrapper[4819]: I0228 03:36:02.433780 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.434958 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.435827 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.437202 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.437863 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.438920 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.440360 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.441062 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.441220 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.442187 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.442736 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.443688 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.444478 
4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.445410 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.445893 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.446469 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.447316 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.447838 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.448647 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.449092 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.452691 4819 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.467915 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.497872 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.509290 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.524134 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.534045 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.534072 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.534081 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.534095 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.534104 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:02Z","lastTransitionTime":"2026-02-28T03:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.546400 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.565614 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bin
ary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.583184 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.636461 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.636814 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.636945 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.637065 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.637314 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:02Z","lastTransitionTime":"2026-02-28T03:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.741190 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.741588 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.741606 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.741628 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.741645 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:02Z","lastTransitionTime":"2026-02-28T03:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.843516 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.843545 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.843554 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.843567 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.843577 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:02Z","lastTransitionTime":"2026-02-28T03:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.852543 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" event={"ID":"e47759ba-9f0e-4aba-b3cf-dc4142c02f41","Type":"ContainerStarted","Data":"c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8"}
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.854988 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"}
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.855051 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"}
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.855695 4819 scope.go:117] "RemoveContainer" containerID="0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6"
Feb 28 03:36:02 crc kubenswrapper[4819]: E0228 03:36:02.855832 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.869540 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.885704 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.900813 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.925856 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.945227 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.945880 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.946022 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.946131 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.946229 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.946357 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:02Z","lastTransitionTime":"2026-02-28T03:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.964955 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.978274 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:02 crc kubenswrapper[4819]: I0228 03:36:02.990706 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.008802 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:03Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.020810 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:03Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.033795 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:03Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.049478 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.049532 4819 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.049540 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.049555 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.049586 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:03Z","lastTransitionTime":"2026-02-28T03:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.057216 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:03Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.152597 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.152640 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.152652 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.152670 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.152682 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:03Z","lastTransitionTime":"2026-02-28T03:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.254874 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.254923 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.254939 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.254956 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.254969 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:03Z","lastTransitionTime":"2026-02-28T03:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.356903 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.356941 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.356949 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.356964 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.356981 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:03Z","lastTransitionTime":"2026-02-28T03:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.459231 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.459335 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.459354 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.459378 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.459395 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:03Z","lastTransitionTime":"2026-02-28T03:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.562470 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.562508 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.562520 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.562536 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.562548 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:03Z","lastTransitionTime":"2026-02-28T03:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.665192 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.665282 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.665302 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.665325 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.665343 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:03Z","lastTransitionTime":"2026-02-28T03:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.768345 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.768412 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.768431 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.768456 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.768474 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:03Z","lastTransitionTime":"2026-02-28T03:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.864599 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.864664 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.864688 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.864711 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.867881 4819 generic.go:334] "Generic (PLEG): container finished" podID="e47759ba-9f0e-4aba-b3cf-dc4142c02f41" containerID="c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8" exitCode=0 Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.867953 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" event={"ID":"e47759ba-9f0e-4aba-b3cf-dc4142c02f41","Type":"ContainerDied","Data":"c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.870804 4819 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.870858 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.870877 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.870898 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.870915 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:03Z","lastTransitionTime":"2026-02-28T03:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.882904 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:03Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.898998 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:03Z is after 2025-08-24T17:21:41Z" Feb 28 
03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.928886 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:03Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.945519 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:03Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.963676 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:03Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.972934 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.972990 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.973007 4819 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.973032 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.973050 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:03Z","lastTransitionTime":"2026-02-28T03:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:03 crc kubenswrapper[4819]: I0228 03:36:03.978258 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:03Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:03.999833 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:03Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.018234 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.033707 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.054971 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.074747 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.074781 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.074794 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.074810 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.074822 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:04Z","lastTransitionTime":"2026-02-28T03:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.082388 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.099161 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.131457 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.131603 4819 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:36:08.131578545 +0000 UTC m=+106.597147403 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.131665 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.131711 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.131797 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.131819 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.131838 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.131852 4819 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.131892 4819 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.131906 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:08.131890862 +0000 UTC m=+106.597459730 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.131910 4819 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.131930 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:08.131920223 +0000 UTC m=+106.597489211 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.131956 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:08.131945463 +0000 UTC m=+106.597514441 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.177052 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.177117 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.177137 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.177162 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.177182 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:04Z","lastTransitionTime":"2026-02-28T03:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.232438 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.232665 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.232704 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.232726 4819 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.232795 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:08.232773242 +0000 UTC m=+106.698342140 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.280812 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.280867 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.280884 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.280907 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.280924 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:04Z","lastTransitionTime":"2026-02-28T03:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.368174 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.368321 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.368806 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.368331 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.368927 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:04 crc kubenswrapper[4819]: E0228 03:36:04.369089 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.384569 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.384630 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.384647 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.384675 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.384692 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:04Z","lastTransitionTime":"2026-02-28T03:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.487821 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.487877 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.487895 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.487922 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.487940 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:04Z","lastTransitionTime":"2026-02-28T03:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.563284 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-q5btw"] Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.563855 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-q5btw" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.566479 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.567069 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.567791 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.569218 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.590451 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.590508 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.590526 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.590553 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.590573 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:04Z","lastTransitionTime":"2026-02-28T03:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.592786 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.613134 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.632139 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.636913 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f29dfe8-c6ab-429e-8ed5-3ca9be724486-host\") pod \"node-ca-q5btw\" (UID: \"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\") " pod="openshift-image-registry/node-ca-q5btw" Feb 28 03:36:04 crc 
kubenswrapper[4819]: I0228 03:36:04.637010 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f29dfe8-c6ab-429e-8ed5-3ca9be724486-serviceca\") pod \"node-ca-q5btw\" (UID: \"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\") " pod="openshift-image-registry/node-ca-q5btw" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.637114 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwbsx\" (UniqueName: \"kubernetes.io/projected/2f29dfe8-c6ab-429e-8ed5-3ca9be724486-kube-api-access-lwbsx\") pod \"node-ca-q5btw\" (UID: \"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\") " pod="openshift-image-registry/node-ca-q5btw" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.657939 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.677499 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.693389 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.693448 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.693465 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.693491 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.693509 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:04Z","lastTransitionTime":"2026-02-28T03:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.699223 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.718698 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.738303 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.738441 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f29dfe8-c6ab-429e-8ed5-3ca9be724486-host\") pod \"node-ca-q5btw\" (UID: \"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\") " pod="openshift-image-registry/node-ca-q5btw" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.738494 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/2f29dfe8-c6ab-429e-8ed5-3ca9be724486-serviceca\") pod \"node-ca-q5btw\" (UID: \"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\") " pod="openshift-image-registry/node-ca-q5btw" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.738535 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwbsx\" (UniqueName: \"kubernetes.io/projected/2f29dfe8-c6ab-429e-8ed5-3ca9be724486-kube-api-access-lwbsx\") pod \"node-ca-q5btw\" (UID: \"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\") " pod="openshift-image-registry/node-ca-q5btw" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.738583 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f29dfe8-c6ab-429e-8ed5-3ca9be724486-host\") pod \"node-ca-q5btw\" (UID: \"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\") " pod="openshift-image-registry/node-ca-q5btw" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.740137 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f29dfe8-c6ab-429e-8ed5-3ca9be724486-serviceca\") pod \"node-ca-q5btw\" (UID: \"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\") " pod="openshift-image-registry/node-ca-q5btw" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.755316 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.772849 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwbsx\" (UniqueName: \"kubernetes.io/projected/2f29dfe8-c6ab-429e-8ed5-3ca9be724486-kube-api-access-lwbsx\") pod \"node-ca-q5btw\" (UID: \"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\") " pod="openshift-image-registry/node-ca-q5btw" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.777220 4819 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a5
3de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.793074 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.796523 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.796594 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.796616 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.796642 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.796662 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:04Z","lastTransitionTime":"2026-02-28T03:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.814282 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.846333 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.875448 4819 generic.go:334] "Generic (PLEG): container finished" podID="e47759ba-9f0e-4aba-b3cf-dc4142c02f41" containerID="bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c" exitCode=0 Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.875533 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" event={"ID":"e47759ba-9f0e-4aba-b3cf-dc4142c02f41","Type":"ContainerDied","Data":"bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c"} Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.879206 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b"} Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.884376 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-q5btw" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.899470 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\
\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.900116 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.900343 4819 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.900588 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.901565 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.901603 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:04Z","lastTransitionTime":"2026-02-28T03:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:04 crc kubenswrapper[4819]: W0228 03:36:04.913859 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f29dfe8_c6ab_429e_8ed5_3ca9be724486.slice/crio-e9759b98df0b13005b29b6559fcd3bc5cf65c4bef2dbc89773d4f073471f99ee WatchSource:0}: Error finding container e9759b98df0b13005b29b6559fcd3bc5cf65c4bef2dbc89773d4f073471f99ee: Status 404 returned error can't find the container with id e9759b98df0b13005b29b6559fcd3bc5cf65c4bef2dbc89773d4f073471f99ee Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.919998 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.942043 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.959393 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.973273 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:04 crc kubenswrapper[4819]: I0228 03:36:04.990202 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:04Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.005193 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.005293 4819 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.005320 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.005354 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.005379 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:05Z","lastTransitionTime":"2026-02-28T03:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.006157 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.018006 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.033092 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.047576 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.065433 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 
03:36:05.080114 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 
03:36:05.095403 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.112675 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.115996 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:05 crc 
kubenswrapper[4819]: I0228 03:36:05.116092 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.116106 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.116637 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.116676 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:05Z","lastTransitionTime":"2026-02-28T03:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.129180 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.147534 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.163679 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.178013 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.192200 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.204266 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.218393 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.218454 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.218484 4819 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.218495 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.218515 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.218526 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:05Z","lastTransitionTime":"2026-02-28T03:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.245810 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.262119 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.273841 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.284164 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.334361 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.335789 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.335811 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.335820 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.335834 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.335843 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:05Z","lastTransitionTime":"2026-02-28T03:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.441995 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.442048 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.442090 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.442116 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.442136 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:05Z","lastTransitionTime":"2026-02-28T03:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.545300 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.545369 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.545387 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.545412 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.545429 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:05Z","lastTransitionTime":"2026-02-28T03:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.648350 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.648422 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.648444 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.648469 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.648488 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:05Z","lastTransitionTime":"2026-02-28T03:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.752134 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.752194 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.752212 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.752239 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.752291 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:05Z","lastTransitionTime":"2026-02-28T03:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.855861 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.855938 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.855961 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.855993 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.856018 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:05Z","lastTransitionTime":"2026-02-28T03:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.887897 4819 generic.go:334] "Generic (PLEG): container finished" podID="e47759ba-9f0e-4aba-b3cf-dc4142c02f41" containerID="6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c" exitCode=0 Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.887995 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" event={"ID":"e47759ba-9f0e-4aba-b3cf-dc4142c02f41","Type":"ContainerDied","Data":"6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.891747 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q5btw" event={"ID":"2f29dfe8-c6ab-429e-8ed5-3ca9be724486","Type":"ContainerStarted","Data":"7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.891835 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q5btw" event={"ID":"2f29dfe8-c6ab-429e-8ed5-3ca9be724486","Type":"ContainerStarted","Data":"e9759b98df0b13005b29b6559fcd3bc5cf65c4bef2dbc89773d4f073471f99ee"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.902170 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.911861 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.936234 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.955674 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.958353 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.958398 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.958416 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.958441 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.958460 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:05Z","lastTransitionTime":"2026-02-28T03:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.971936 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.987703 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:05 crc kubenswrapper[4819]: I0228 03:36:05.999829 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:05Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.027755 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.040644 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.057711 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.062521 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.062578 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.062596 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.062622 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.062641 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:06Z","lastTransitionTime":"2026-02-28T03:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.070874 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.091295 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c822
8f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.106200 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.124200 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.142415 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.159697 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.164842 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.164874 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.164882 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.164894 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.164908 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:06Z","lastTransitionTime":"2026-02-28T03:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.176556 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.189512 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.204446 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.220862 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.249905 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.268090 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.268136 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.268153 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.268176 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.268194 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:06Z","lastTransitionTime":"2026-02-28T03:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.277316 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.298520 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.316481 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.338338 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.357814 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.368089 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.368150 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.368201 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:06 crc kubenswrapper[4819]: E0228 03:36:06.368326 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:06 crc kubenswrapper[4819]: E0228 03:36:06.368452 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:06 crc kubenswrapper[4819]: E0228 03:36:06.368670 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.370547 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.370642 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.370670 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.370702 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.370728 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:06Z","lastTransitionTime":"2026-02-28T03:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.378976 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.473823 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.473883 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.473900 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.473926 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.473944 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:06Z","lastTransitionTime":"2026-02-28T03:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.577076 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.577126 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.577144 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.577167 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.577185 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:06Z","lastTransitionTime":"2026-02-28T03:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.680537 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.680612 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.680631 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.680657 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.680677 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:06Z","lastTransitionTime":"2026-02-28T03:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.783042 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.783145 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.783169 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.783193 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.783213 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:06Z","lastTransitionTime":"2026-02-28T03:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.886422 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.886480 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.886497 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.886519 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.886562 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:06Z","lastTransitionTime":"2026-02-28T03:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.912721 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" event={"ID":"e47759ba-9f0e-4aba-b3cf-dc4142c02f41","Type":"ContainerStarted","Data":"a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0"} Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.934535 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.952570 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.973478 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.989718 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.989768 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.989787 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.989811 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.989833 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:06Z","lastTransitionTime":"2026-02-28T03:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:06 crc kubenswrapper[4819]: I0228 03:36:06.993543 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:06Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.012055 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:07Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.027198 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:07Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.043007 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:07Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.061358 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:07Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.092782 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.092825 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.092835 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.092857 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.092869 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:07Z","lastTransitionTime":"2026-02-28T03:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.095126 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:07Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.115037 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:07Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.134516 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:07Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.152487 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:07Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.180816 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbj
p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:07Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.195766 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.195832 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.195850 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.195876 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.195894 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:07Z","lastTransitionTime":"2026-02-28T03:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.297918 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.297964 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.297976 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.298009 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.298026 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:07Z","lastTransitionTime":"2026-02-28T03:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.401293 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.401360 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.401378 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.401406 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.401427 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:07Z","lastTransitionTime":"2026-02-28T03:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.503860 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.503923 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.503941 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.503970 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.503989 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:07Z","lastTransitionTime":"2026-02-28T03:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.609541 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.609847 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.609871 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.609901 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.609924 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:07Z","lastTransitionTime":"2026-02-28T03:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.712721 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.712786 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.712804 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.712827 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.712844 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:07Z","lastTransitionTime":"2026-02-28T03:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.816116 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.816181 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.816199 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.816223 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.816241 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:07Z","lastTransitionTime":"2026-02-28T03:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.918304 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.918351 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.918370 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.918392 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.918409 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:07Z","lastTransitionTime":"2026-02-28T03:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.924604 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa"} Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.924869 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.925016 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.925277 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.929825 4819 generic.go:334] "Generic (PLEG): container finished" podID="e47759ba-9f0e-4aba-b3cf-dc4142c02f41" containerID="a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0" exitCode=0 Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.929866 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" event={"ID":"e47759ba-9f0e-4aba-b3cf-dc4142c02f41","Type":"ContainerDied","Data":"a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0"} Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.956243 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:07Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.997119 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 
03:36:07 crc kubenswrapper[4819]: I0228 03:36:07.998352 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.003831 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.023491 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.024219 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.024282 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.024298 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.024318 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.024332 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:08Z","lastTransitionTime":"2026-02-28T03:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.041608 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.055975 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.073084 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.086329 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.098152 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.119290 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.126696 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.126748 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.126763 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.126788 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.126805 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:08Z","lastTransitionTime":"2026-02-28T03:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.133469 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.170257 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.188303 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.193654 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.193772 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.193839 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:36:16.19381533 +0000 UTC m=+114.659384188 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.193898 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.193911 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.193968 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.193920 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.194024 4819 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.193968 4819 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.194100 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:16.194083186 +0000 UTC m=+114.659652044 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.194130 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:16.194108607 +0000 UTC m=+114.659677475 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.194044 4819 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.194287 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:16.19425675 +0000 UTC m=+114.659825618 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.212141 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbj
p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.224402 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.228381 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.228415 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.228424 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.228438 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.228446 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:08Z","lastTransitionTime":"2026-02-28T03:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.235181 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.249825 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.260528 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.272433 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.284554 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.295257 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.295359 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.295385 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.295396 4819 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.295439 4819 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:16.295426307 +0000 UTC m=+114.760995165 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.296689 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.307318 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.322617 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.330874 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.330907 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.330920 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.330939 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.330951 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:08Z","lastTransitionTime":"2026-02-28T03:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.338959 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.349091 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.362592 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.368699 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.368767 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.368823 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.368963 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.369081 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:08 crc kubenswrapper[4819]: E0228 03:36:08.369306 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.392556 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.433435 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.433484 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.433495 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.433511 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.433523 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:08Z","lastTransitionTime":"2026-02-28T03:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.536536 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.536627 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.536653 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.536684 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.536711 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:08Z","lastTransitionTime":"2026-02-28T03:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.639964 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.640019 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.640036 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.640061 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.640078 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:08Z","lastTransitionTime":"2026-02-28T03:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.743188 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.743314 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.743346 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.743377 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.743401 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:08Z","lastTransitionTime":"2026-02-28T03:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.847482 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.847547 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.847570 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.847598 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.847623 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:08Z","lastTransitionTime":"2026-02-28T03:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.939130 4819 generic.go:334] "Generic (PLEG): container finished" podID="e47759ba-9f0e-4aba-b3cf-dc4142c02f41" containerID="180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926" exitCode=0 Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.939271 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" event={"ID":"e47759ba-9f0e-4aba-b3cf-dc4142c02f41","Type":"ContainerDied","Data":"180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926"} Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.950115 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.950171 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.950189 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.950213 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.950231 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:08Z","lastTransitionTime":"2026-02-28T03:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.964625 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z 
is after 2025-08-24T17:21:41Z" Feb 28 03:36:08 crc kubenswrapper[4819]: I0228 03:36:08.989422 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:08Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.011167 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.027606 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.048896 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.053768 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.053797 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.053810 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:09 crc 
kubenswrapper[4819]: I0228 03:36:09.053826 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.053838 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:09Z","lastTransitionTime":"2026-02-28T03:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.063497 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.075095 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.090523 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.118279 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.132559 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-2
8T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.146998 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.156285 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.156330 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.156342 4819 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.156359 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.156371 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:09Z","lastTransitionTime":"2026-02-28T03:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.158877 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.172389 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.259044 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.259087 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.259098 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.259113 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.259129 4819 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:09Z","lastTransitionTime":"2026-02-28T03:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.362847 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.362898 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.362916 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.362938 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.362956 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:09Z","lastTransitionTime":"2026-02-28T03:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.465131 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.465181 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.465193 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.465210 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.465224 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:09Z","lastTransitionTime":"2026-02-28T03:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.567676 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.567725 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.567737 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.567755 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.567769 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:09Z","lastTransitionTime":"2026-02-28T03:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.670690 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.670777 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.670805 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.670836 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.670859 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:09Z","lastTransitionTime":"2026-02-28T03:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.773746 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.773806 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.773824 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.773848 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.773885 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:09Z","lastTransitionTime":"2026-02-28T03:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.876920 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.876984 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.877003 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.877029 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.877049 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:09Z","lastTransitionTime":"2026-02-28T03:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.947903 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" event={"ID":"e47759ba-9f0e-4aba-b3cf-dc4142c02f41","Type":"ContainerStarted","Data":"19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557"} Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.968615 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.980190 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.980286 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.980305 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.980333 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.980353 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:09Z","lastTransitionTime":"2026-02-28T03:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:09 crc kubenswrapper[4819]: I0228 03:36:09.989496 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:09Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.004885 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.025938 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.046314 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.061281 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.074337 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.083707 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.083767 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.083785 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.083812 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.083831 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:10Z","lastTransitionTime":"2026-02-28T03:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.090128 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.120913 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.144004 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8
424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd93
3edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T
03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.162240 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.181999 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.188439 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.188502 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.188522 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.188547 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.188565 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:10Z","lastTransitionTime":"2026-02-28T03:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.224083 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.290557 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.290787 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.290845 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.290901 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.290953 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:10Z","lastTransitionTime":"2026-02-28T03:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.368754 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:10 crc kubenswrapper[4819]: E0228 03:36:10.368927 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.369114 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:10 crc kubenswrapper[4819]: E0228 03:36:10.369284 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.369405 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:10 crc kubenswrapper[4819]: E0228 03:36:10.369515 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.392812 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.392943 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.392997 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.393079 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.393139 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:10Z","lastTransitionTime":"2026-02-28T03:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.495358 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.495403 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.495416 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.495433 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.495447 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:10Z","lastTransitionTime":"2026-02-28T03:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.597922 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.597983 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.598001 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.598027 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.598046 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:10Z","lastTransitionTime":"2026-02-28T03:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.680909 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p"] Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.681845 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.685169 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.685361 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.700679 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.700718 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.700730 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.700747 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.700761 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:10Z","lastTransitionTime":"2026-02-28T03:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.702108 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.715976 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.732323 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.749410 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.764882 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.782020 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.803446 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.803755 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.803938 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.804091 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.804223 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:10Z","lastTransitionTime":"2026-02-28T03:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.815357 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.830824 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjjmn\" (UniqueName: \"kubernetes.io/projected/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-kube-api-access-hjjmn\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.831078 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.831321 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.831512 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.834212 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.851887 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.862605 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.884678 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.901420 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.906779 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.906815 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.906827 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.906846 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.906858 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:10Z","lastTransitionTime":"2026-02-28T03:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.918591 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.933186 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.933279 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.933336 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.933387 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjjmn\" (UniqueName: \"kubernetes.io/projected/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-kube-api-access-hjjmn\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.934063 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.934390 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.940307 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc 
kubenswrapper[4819]: I0228 03:36:10.940230 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc 
kubenswrapper[4819]: I0228 03:36:10.967176 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjjmn\" (UniqueName: \"kubernetes.io/projected/ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3-kube-api-access-hjjmn\") pod \"ovnkube-control-plane-749d76644c-8nk8p\" (UID: \"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.982193 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.982235 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.982280 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.982303 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.982320 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:10Z","lastTransitionTime":"2026-02-28T03:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:10 crc kubenswrapper[4819]: E0228 03:36:10.997117 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:10Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:10 crc kubenswrapper[4819]: I0228 03:36:10.999284 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.002929 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.002979 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.002996 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.003025 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.003047 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: W0228 03:36:11.017635 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad04ba9d_eb5c_422e_bf1b_8f6dee9399d3.slice/crio-4c86960056451d975690c6e2894e4013c46c0bdcb67b14d67108388bbcf36c70 WatchSource:0}: Error finding container 4c86960056451d975690c6e2894e4013c46c0bdcb67b14d67108388bbcf36c70: Status 404 returned error can't find the container with id 4c86960056451d975690c6e2894e4013c46c0bdcb67b14d67108388bbcf36c70 Feb 28 03:36:11 crc kubenswrapper[4819]: E0228 03:36:11.028855 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.034079 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.034154 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.034176 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.034215 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.034235 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: E0228 03:36:11.070803 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.077843 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.077899 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.077917 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.077940 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.077959 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: E0228 03:36:11.096714 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.101728 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.101794 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.101812 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.101836 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.101857 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: E0228 03:36:11.118283 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: E0228 03:36:11.118516 4819 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.120160 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.120212 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.120229 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.120276 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.120296 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.223538 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.223596 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.223614 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.223637 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.223654 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.328686 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.329197 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.329296 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.329328 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.329384 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.434596 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.434670 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.434696 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.434728 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.434751 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.435924 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-lbrtr"] Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.436725 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:11 crc kubenswrapper[4819]: E0228 03:36:11.436819 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.457033 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.478071 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.499013 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.517387 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.538308 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.538368 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.538382 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.538400 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.538412 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.542392 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.542470 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82ldl\" (UniqueName: \"kubernetes.io/projected/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-kube-api-access-82ldl\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.542688 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e
878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.567653 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.588351 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.616106 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.636334 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.641043 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.641092 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.641107 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.641127 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.641141 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.643121 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.643269 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82ldl\" (UniqueName: \"kubernetes.io/projected/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-kube-api-access-82ldl\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:11 crc kubenswrapper[4819]: E0228 03:36:11.643332 4819 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:11 crc kubenswrapper[4819]: E0228 03:36:11.643403 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs podName:e7eede0c-6dc0-48ac-8065-7e0d9ed91212 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:12.143382128 +0000 UTC m=+110.608950996 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs") pod "network-metrics-daemon-lbrtr" (UID: "e7eede0c-6dc0-48ac-8065-7e0d9ed91212") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.651216 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.661953 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82ldl\" (UniqueName: \"kubernetes.io/projected/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-kube-api-access-82ldl\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.665242 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.679800 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.690070 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.707749 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.730816 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.743746 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.743853 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.743878 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.743909 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.743940 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.846995 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.847056 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.847073 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.847097 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.847115 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.950824 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.950903 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.950926 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.950958 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.950985 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:11Z","lastTransitionTime":"2026-02-28T03:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.963932 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/0.log" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.969065 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa" exitCode=1 Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.969177 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.971461 4819 scope.go:117] "RemoveContainer" containerID="3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa" Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.972274 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" event={"ID":"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3","Type":"ContainerStarted","Data":"5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.972340 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" event={"ID":"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3","Type":"ContainerStarted","Data":"dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0"} Feb 28 03:36:11 crc kubenswrapper[4819]: I0228 03:36:11.972363 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" 
event={"ID":"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3","Type":"ContainerStarted","Data":"4c86960056451d975690c6e2894e4013c46c0bdcb67b14d67108388bbcf36c70"} Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.001033 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:11Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.022594 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.041179 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.058447 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.058488 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.058504 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:12 crc 
kubenswrapper[4819]: I0228 03:36:12.058528 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.058548 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:12Z","lastTransitionTime":"2026-02-28T03:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.058676 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.076797 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.100426 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.135889 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:11.066642 6603 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:11.067466 6603 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0228 03:36:11.067537 6603 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:11.067547 6603 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:11.067568 6603 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 03:36:11.067575 6603 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 03:36:11.067595 6603 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0228 03:36:11.067627 6603 factory.go:656] Stopping watch factory\\\\nI0228 03:36:11.067639 6603 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0228 03:36:11.067648 6603 ovnkube.go:599] Stopped ovnkube\\\\nI0228 03:36:11.067653 6603 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 03:36:11.067673 6603 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0228 03:36:11.067673 6603 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0228 03:36:11.067694 6603 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 
03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.149889 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:12 crc kubenswrapper[4819]: E0228 03:36:12.150125 4819 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:12 crc kubenswrapper[4819]: E0228 03:36:12.150203 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs podName:e7eede0c-6dc0-48ac-8065-7e0d9ed91212 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:13.150179532 +0000 UTC m=+111.615748430 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs") pod "network-metrics-daemon-lbrtr" (UID: "e7eede0c-6dc0-48ac-8065-7e0d9ed91212") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.159657 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-
cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.168898 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.169114 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.169228 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.169418 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.169556 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:12Z","lastTransitionTime":"2026-02-28T03:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.183307 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.199815 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.224395 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.240044 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.254611 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03
:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.272269 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.272992 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.273049 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.273068 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.273093 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.273111 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:12Z","lastTransitionTime":"2026-02-28T03:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.283981 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc 
kubenswrapper[4819]: I0228 03:36:12.301906 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.318868 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.335734 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.348387 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.361623 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.368859 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.368996 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.369050 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:12 crc kubenswrapper[4819]: E0228 03:36:12.369044 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:12 crc kubenswrapper[4819]: E0228 03:36:12.369238 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:12 crc kubenswrapper[4819]: E0228 03:36:12.369435 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.376211 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.376346 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.376410 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.376470 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.376543 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:12Z","lastTransitionTime":"2026-02-28T03:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.381198 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.412027 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:11.066642 6603 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:11.067466 6603 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0228 03:36:11.067537 6603 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:11.067547 6603 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:11.067568 6603 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 03:36:11.067575 6603 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 03:36:11.067595 6603 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0228 03:36:11.067627 6603 factory.go:656] Stopping watch factory\\\\nI0228 03:36:11.067639 6603 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0228 03:36:11.067648 6603 ovnkube.go:599] Stopped ovnkube\\\\nI0228 03:36:11.067653 6603 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 03:36:11.067673 6603 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0228 03:36:11.067673 6603 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0228 03:36:11.067694 6603 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 
03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.431383 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e
878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.449451 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.470288 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.479560 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.479621 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.479650 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.479685 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.479708 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:12Z","lastTransitionTime":"2026-02-28T03:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.491318 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.512763 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.535650 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.559034 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.577911 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.582899 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.582968 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.582992 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.583024 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.583041 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:12Z","lastTransitionTime":"2026-02-28T03:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.599998 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.622618 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 
03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.654161 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:11.066642 6603 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:11.067466 6603 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0228 03:36:11.067537 6603 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:11.067547 6603 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:11.067568 6603 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 03:36:11.067575 6603 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 03:36:11.067595 6603 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0228 03:36:11.067627 6603 factory.go:656] Stopping watch factory\\\\nI0228 03:36:11.067639 6603 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0228 03:36:11.067648 6603 ovnkube.go:599] Stopped ovnkube\\\\nI0228 03:36:11.067653 6603 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 03:36:11.067673 6603 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0228 03:36:11.067673 6603 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0228 03:36:11.067694 6603 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 
03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.675662 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f
195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.686142 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.686171 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.686186 4819 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.686208 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.686225 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:12Z","lastTransitionTime":"2026-02-28T03:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.694759 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.711758 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.725414 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.743889 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.761682 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.780385 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.788838 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.788868 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.788878 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.788894 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.788906 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:12Z","lastTransitionTime":"2026-02-28T03:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.794545 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc 
kubenswrapper[4819]: I0228 03:36:12.813855 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.835474 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.854574 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.869380 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.891717 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.891766 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.891777 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.891796 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.891808 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:12Z","lastTransitionTime":"2026-02-28T03:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.980627 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/0.log" Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.992894 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04"} Feb 28 03:36:12 crc kubenswrapper[4819]: I0228 03:36:12.993515 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.000683 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.000750 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.000775 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.000806 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.000829 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:13Z","lastTransitionTime":"2026-02-28T03:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.024038 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:11.066642 6603 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:11.067466 6603 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0228 
03:36:11.067537 6603 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:11.067547 6603 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:11.067568 6603 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 03:36:11.067575 6603 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 03:36:11.067595 6603 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0228 03:36:11.067627 6603 factory.go:656] Stopping watch factory\\\\nI0228 03:36:11.067639 6603 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0228 03:36:11.067648 6603 ovnkube.go:599] Stopped ovnkube\\\\nI0228 03:36:11.067653 6603 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 03:36:11.067673 6603 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0228 03:36:11.067673 6603 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0228 03:36:11.067694 6603 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 
03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.034295 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.053550 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.070051 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.091999 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.102971 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.103020 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.103034 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.103056 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.103071 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:13Z","lastTransitionTime":"2026-02-28T03:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.108802 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.125059 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.138675 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.151812 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.160416 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:13 crc kubenswrapper[4819]: E0228 03:36:13.160605 4819 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:13 crc kubenswrapper[4819]: E0228 03:36:13.160661 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs podName:e7eede0c-6dc0-48ac-8065-7e0d9ed91212 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:15.160646101 +0000 UTC m=+113.626214979 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs") pod "network-metrics-daemon-lbrtr" (UID: "e7eede0c-6dc0-48ac-8065-7e0d9ed91212") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.163913 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.178074 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc 
kubenswrapper[4819]: I0228 03:36:13.196105 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.205432 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.205533 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.205558 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.205583 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.205603 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:13Z","lastTransitionTime":"2026-02-28T03:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.215820 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.232386 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.247009 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.308671 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.308728 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.308749 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.308773 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.308792 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:13Z","lastTransitionTime":"2026-02-28T03:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.368386 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:13 crc kubenswrapper[4819]: E0228 03:36:13.368624 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.411718 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.411928 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.412046 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.412163 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.412328 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:13Z","lastTransitionTime":"2026-02-28T03:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.515815 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.515876 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.515894 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.515919 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.515937 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:13Z","lastTransitionTime":"2026-02-28T03:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.619108 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.619189 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.619213 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.619273 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.619296 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:13Z","lastTransitionTime":"2026-02-28T03:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.722344 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.722390 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.722406 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.722429 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.722448 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:13Z","lastTransitionTime":"2026-02-28T03:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.825564 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.825621 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.825639 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.825662 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.825680 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:13Z","lastTransitionTime":"2026-02-28T03:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.928494 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.928545 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.928561 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.928585 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.928601 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:13Z","lastTransitionTime":"2026-02-28T03:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:13 crc kubenswrapper[4819]: I0228 03:36:13.999584 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/1.log" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.000757 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/0.log" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.005453 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04" exitCode=1 Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.005504 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04"} Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.005550 4819 scope.go:117] "RemoveContainer" containerID="3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.006835 4819 scope.go:117] "RemoveContainer" containerID="89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04" Feb 28 03:36:14 crc kubenswrapper[4819]: E0228 03:36:14.007151 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.029129 4819 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.031464 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.031523 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.031544 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.031575 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.031595 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:14Z","lastTransitionTime":"2026-02-28T03:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.051334 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.084853 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a53308e4f2baf1a7de543a5d4771cae889fdbd2abe9ad6d3bc36b44a99223aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:11.066642 6603 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:11.067466 6603 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0228 
03:36:11.067537 6603 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:11.067547 6603 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:11.067568 6603 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 03:36:11.067575 6603 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 03:36:11.067595 6603 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0228 03:36:11.067627 6603 factory.go:656] Stopping watch factory\\\\nI0228 03:36:11.067639 6603 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0228 03:36:11.067648 6603 ovnkube.go:599] Stopped ovnkube\\\\nI0228 03:36:11.067653 6603 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 03:36:11.067673 6603 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0228 03:36:11.067673 6603 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0228 03:36:11.067694 6603 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:13Z\\\",\\\"message\\\":\\\"OVN-Kubernetes controller took 0.104363729 seconds. 
No OVN measurement.\\\\nI0228 03:36:13.389115 6816 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0228 03:36:13.389160 6816 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0228 03:36:13.389167 6816 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0228 03:36:13.389210 6816 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF0228 03:36:13.389310 6816 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43
ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.111234 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e
878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.130365 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.134964 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.135007 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.135024 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.135049 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.135066 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:14Z","lastTransitionTime":"2026-02-28T03:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.152319 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.171400 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.192803 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.215930 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.237973 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.239284 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.239384 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.239403 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.239427 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.239445 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:14Z","lastTransitionTime":"2026-02-28T03:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.257292 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc 
kubenswrapper[4819]: I0228 03:36:14.298963 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.339853 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.342739 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.342772 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.342782 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.342798 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.342810 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:14Z","lastTransitionTime":"2026-02-28T03:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.357110 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\"
,\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.368785 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.368835 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.368859 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:14 crc kubenswrapper[4819]: E0228 03:36:14.368938 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:14 crc kubenswrapper[4819]: E0228 03:36:14.369091 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:14 crc kubenswrapper[4819]: E0228 03:36:14.369361 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.371205 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:14Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.445944 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.446000 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.446018 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.446040 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.446057 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:14Z","lastTransitionTime":"2026-02-28T03:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.548568 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.548623 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.548641 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.548663 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.548681 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:14Z","lastTransitionTime":"2026-02-28T03:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.652087 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.652146 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.652165 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.652186 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.652204 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:14Z","lastTransitionTime":"2026-02-28T03:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.754955 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.755010 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.755026 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.755051 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.755070 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:14Z","lastTransitionTime":"2026-02-28T03:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.858157 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.858219 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.858240 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.858288 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.858305 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:14Z","lastTransitionTime":"2026-02-28T03:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.960741 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.960798 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.960815 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.960838 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:14 crc kubenswrapper[4819]: I0228 03:36:14.960860 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:14Z","lastTransitionTime":"2026-02-28T03:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.011762 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/1.log" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.016911 4819 scope.go:117] "RemoveContainer" containerID="89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04" Feb 28 03:36:15 crc kubenswrapper[4819]: E0228 03:36:15.017157 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.036841 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.055660 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.064065 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.064112 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.064132 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.064155 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.064173 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:15Z","lastTransitionTime":"2026-02-28T03:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.073956 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.096328 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.114335 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f
195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.133119 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.147858 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.167086 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.167143 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.167160 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.167185 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.167210 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:15Z","lastTransitionTime":"2026-02-28T03:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.168440 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z 
is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.188405 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.196657 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:15 crc kubenswrapper[4819]: E0228 03:36:15.196887 4819 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:15 crc kubenswrapper[4819]: E0228 03:36:15.196978 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs podName:e7eede0c-6dc0-48ac-8065-7e0d9ed91212 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:19.196953726 +0000 UTC m=+117.662522624 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs") pod "network-metrics-daemon-lbrtr" (UID: "e7eede0c-6dc0-48ac-8065-7e0d9ed91212") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.205614 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc 
kubenswrapper[4819]: I0228 03:36:15.219734 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l
wbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.240885 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.256893 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.270028 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.270401 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.270592 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.270796 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.270990 4819 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:15Z","lastTransitionTime":"2026-02-28T03:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.277632 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.314860 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:13Z\\\",\\\"message\\\":\\\"OVN-Kubernetes controller took 0.104363729 seconds. 
No OVN measurement.\\\\nI0228 03:36:13.389115 6816 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0228 03:36:13.389160 6816 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0228 03:36:13.389167 6816 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0228 03:36:13.389210 6816 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF0228 03:36:13.389310 6816 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:15Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.368097 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:15 crc kubenswrapper[4819]: E0228 03:36:15.368340 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.374838 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.375149 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.375357 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.375500 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.375630 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:15Z","lastTransitionTime":"2026-02-28T03:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.478712 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.478777 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.478795 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.478819 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.478838 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:15Z","lastTransitionTime":"2026-02-28T03:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.582484 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.582552 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.582572 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.582597 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.582614 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:15Z","lastTransitionTime":"2026-02-28T03:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.686348 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.687105 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.687134 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.687377 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.687400 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:15Z","lastTransitionTime":"2026-02-28T03:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.790500 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.790569 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.790584 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.790610 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.790633 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:15Z","lastTransitionTime":"2026-02-28T03:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.894041 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.894102 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.894122 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.894144 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.894163 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:15Z","lastTransitionTime":"2026-02-28T03:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.997095 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.997157 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.997176 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.997201 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:15 crc kubenswrapper[4819]: I0228 03:36:15.997218 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:15Z","lastTransitionTime":"2026-02-28T03:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.099738 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.099800 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.099870 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.099897 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.099913 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:16Z","lastTransitionTime":"2026-02-28T03:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.202815 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.202874 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.202894 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.202920 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.202943 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:16Z","lastTransitionTime":"2026-02-28T03:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.205186 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.205355 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.205429 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.205464 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:36:32.205431169 +0000 UTC m=+130.671000067 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.205533 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.205565 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.205612 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.205631 4819 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.205661 4819 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:16 crc 
kubenswrapper[4819]: E0228 03:36:16.205706 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:32.205683544 +0000 UTC m=+130.671252432 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.205733 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:32.205721395 +0000 UTC m=+130.671290283 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.205749 4819 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.205814 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-28 03:36:32.205798667 +0000 UTC m=+130.671367555 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.306185 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.306311 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.306351 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.306367 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.306391 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.306409 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:16Z","lastTransitionTime":"2026-02-28T03:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.306468 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.306502 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.306521 4819 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.306612 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:32.306586635 +0000 UTC m=+130.772155523 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.368390 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.368500 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.368559 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.368674 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.368851 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:16 crc kubenswrapper[4819]: E0228 03:36:16.368989 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.369885 4819 scope.go:117] "RemoveContainer" containerID="0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.409561 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.409724 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.409757 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.409790 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.409814 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:16Z","lastTransitionTime":"2026-02-28T03:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.512546 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.512628 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.512655 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.512686 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.512708 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:16Z","lastTransitionTime":"2026-02-28T03:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.616058 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.616112 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.616129 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.616153 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.616171 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:16Z","lastTransitionTime":"2026-02-28T03:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.720899 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.720961 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.720981 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.721007 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.721023 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:16Z","lastTransitionTime":"2026-02-28T03:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.823848 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.823883 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.823894 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.823910 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.823921 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:16Z","lastTransitionTime":"2026-02-28T03:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.926646 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.926713 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.926733 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.926759 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:16 crc kubenswrapper[4819]: I0228 03:36:16.926783 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:16Z","lastTransitionTime":"2026-02-28T03:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.033009 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.033070 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.033088 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.033119 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.033138 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:17Z","lastTransitionTime":"2026-02-28T03:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.035899 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.038159 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb"} Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.038748 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.063423 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c
4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd36
7c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.084338 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f
195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.104752 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.124057 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.136485 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.136539 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.136556 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.136581 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.136598 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:17Z","lastTransitionTime":"2026-02-28T03:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.142576 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.163775 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.184360 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.200554 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc 
kubenswrapper[4819]: I0228 03:36:17.225164 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.239374 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.239639 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.239748 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.239871 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.239963 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:17Z","lastTransitionTime":"2026-02-28T03:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.247718 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.270411 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.288107 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.304560 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.325027 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.342142 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.342190 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.342219 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.342238 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.342278 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:17Z","lastTransitionTime":"2026-02-28T03:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.355495 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:13Z\\\",\\\"message\\\":\\\"OVN-Kubernetes controller took 0.104363729 seconds. 
No OVN measurement.\\\\nI0228 03:36:13.389115 6816 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0228 03:36:13.389160 6816 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0228 03:36:13.389167 6816 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0228 03:36:13.389210 6816 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF0228 03:36:13.389310 6816 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:17Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.368681 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:17 crc kubenswrapper[4819]: E0228 03:36:17.368825 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.445116 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.445177 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.445195 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.445220 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.445236 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:17Z","lastTransitionTime":"2026-02-28T03:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.547933 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.547985 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.548001 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.548023 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.548040 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:17Z","lastTransitionTime":"2026-02-28T03:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.650220 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.650711 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.650925 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.651117 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.651322 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:17Z","lastTransitionTime":"2026-02-28T03:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.754274 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.754343 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.754362 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.754386 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.754406 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:17Z","lastTransitionTime":"2026-02-28T03:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.857345 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.857408 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.857425 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.857449 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.857469 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:17Z","lastTransitionTime":"2026-02-28T03:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.960794 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.960926 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.960952 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.960980 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:17 crc kubenswrapper[4819]: I0228 03:36:17.961000 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:17Z","lastTransitionTime":"2026-02-28T03:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.063868 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.063927 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.063944 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.063971 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.063990 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:18Z","lastTransitionTime":"2026-02-28T03:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.166502 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.166569 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.166586 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.166611 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.166627 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:18Z","lastTransitionTime":"2026-02-28T03:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.269430 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.269484 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.269502 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.269527 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.269547 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:18Z","lastTransitionTime":"2026-02-28T03:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.368600 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.368641 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:18 crc kubenswrapper[4819]: E0228 03:36:18.368771 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.368819 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:18 crc kubenswrapper[4819]: E0228 03:36:18.368960 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:18 crc kubenswrapper[4819]: E0228 03:36:18.369116 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.372604 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.372650 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.372666 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.372689 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.372707 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:18Z","lastTransitionTime":"2026-02-28T03:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.474887 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.474965 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.474983 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.475008 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.475026 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:18Z","lastTransitionTime":"2026-02-28T03:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.578193 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.578278 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.578297 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.578321 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.578340 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:18Z","lastTransitionTime":"2026-02-28T03:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.681780 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.681837 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.681855 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.681880 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.681897 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:18Z","lastTransitionTime":"2026-02-28T03:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.785009 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.785103 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.785121 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.785146 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.785165 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:18Z","lastTransitionTime":"2026-02-28T03:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.888053 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.888121 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.888140 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.888165 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.888183 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:18Z","lastTransitionTime":"2026-02-28T03:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.993348 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.993399 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.993411 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.993429 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:18 crc kubenswrapper[4819]: I0228 03:36:18.993443 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:18Z","lastTransitionTime":"2026-02-28T03:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.095836 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.095924 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.095942 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.095968 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.095988 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:19Z","lastTransitionTime":"2026-02-28T03:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.199057 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.199122 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.199140 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.199164 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.199181 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:19Z","lastTransitionTime":"2026-02-28T03:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.240821 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:19 crc kubenswrapper[4819]: E0228 03:36:19.241096 4819 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:19 crc kubenswrapper[4819]: E0228 03:36:19.241221 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs podName:e7eede0c-6dc0-48ac-8065-7e0d9ed91212 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:27.241193123 +0000 UTC m=+125.706762011 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs") pod "network-metrics-daemon-lbrtr" (UID: "e7eede0c-6dc0-48ac-8065-7e0d9ed91212") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.301476 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.301536 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.301554 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.301579 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.301597 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:19Z","lastTransitionTime":"2026-02-28T03:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.368240 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:19 crc kubenswrapper[4819]: E0228 03:36:19.368471 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.404112 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.404227 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.404275 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.404297 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.404313 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:19Z","lastTransitionTime":"2026-02-28T03:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.506786 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.506860 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.506877 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.506918 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.506936 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:19Z","lastTransitionTime":"2026-02-28T03:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.616410 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.616488 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.616517 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.616550 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.616574 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:19Z","lastTransitionTime":"2026-02-28T03:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.720213 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.720419 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.720439 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.720466 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.720483 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:19Z","lastTransitionTime":"2026-02-28T03:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.823385 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.823449 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.823470 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.823499 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.823518 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:19Z","lastTransitionTime":"2026-02-28T03:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.927003 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.927071 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.927089 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.927115 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:19 crc kubenswrapper[4819]: I0228 03:36:19.927133 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:19Z","lastTransitionTime":"2026-02-28T03:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.029777 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.030071 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.030333 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.030559 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.030750 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:20Z","lastTransitionTime":"2026-02-28T03:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.134364 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.134727 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.134898 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.135047 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.135172 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:20Z","lastTransitionTime":"2026-02-28T03:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.238774 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.238832 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.238850 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.238876 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.238916 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:20Z","lastTransitionTime":"2026-02-28T03:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.342310 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.342368 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.342385 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.342411 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.342429 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:20Z","lastTransitionTime":"2026-02-28T03:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.373081 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.373102 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.373218 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:20 crc kubenswrapper[4819]: E0228 03:36:20.373440 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:20 crc kubenswrapper[4819]: E0228 03:36:20.373583 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:20 crc kubenswrapper[4819]: E0228 03:36:20.373748 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.444792 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.444886 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.444904 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.444926 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.444943 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:20Z","lastTransitionTime":"2026-02-28T03:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.547991 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.548056 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.548074 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.548099 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.548118 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:20Z","lastTransitionTime":"2026-02-28T03:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.651857 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.651919 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.651938 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.651964 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.651982 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:20Z","lastTransitionTime":"2026-02-28T03:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.755037 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.755092 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.755104 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.755121 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.755132 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:20Z","lastTransitionTime":"2026-02-28T03:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.857850 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.858142 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.858325 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.858484 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.858640 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:20Z","lastTransitionTime":"2026-02-28T03:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.961367 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.961427 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.961444 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.961474 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:20 crc kubenswrapper[4819]: I0228 03:36:20.961492 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:20Z","lastTransitionTime":"2026-02-28T03:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.064236 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.064320 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.064338 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.064362 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.064380 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.167669 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.167731 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.167748 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.167771 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.167792 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.255426 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.255499 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.255517 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.255545 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.255566 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: E0228 03:36:21.275104 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:21Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.280175 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.280420 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.280560 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.280723 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.280872 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: E0228 03:36:21.299547 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:21Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.303297 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.303331 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.303340 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.303357 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.303367 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: E0228 03:36:21.323583 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:21Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.327871 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.327918 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.327930 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.327948 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.327967 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: E0228 03:36:21.343358 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:21Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.347348 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.347574 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.347718 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.347872 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.348001 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: E0228 03:36:21.364227 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:21Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:21 crc kubenswrapper[4819]: E0228 03:36:21.364523 4819 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.366407 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.366455 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.366473 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.366497 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.366515 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.368627 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:21 crc kubenswrapper[4819]: E0228 03:36:21.368766 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.470059 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.470118 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.470138 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.470163 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.470182 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.572337 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.572388 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.572406 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.572428 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.572445 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.675205 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.675283 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.675304 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.675327 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.675345 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.778044 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.778098 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.778115 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.778138 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.778155 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.880812 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.880871 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.880887 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.880912 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.880930 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.983592 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.983648 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.983665 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.983693 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:21 crc kubenswrapper[4819]: I0228 03:36:21.983710 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:21Z","lastTransitionTime":"2026-02-28T03:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.086978 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.087022 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.087033 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.087049 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.087061 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:22Z","lastTransitionTime":"2026-02-28T03:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.190362 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.190408 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.190424 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.190447 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.190463 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:22Z","lastTransitionTime":"2026-02-28T03:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:22 crc kubenswrapper[4819]: E0228 03:36:22.291550 4819 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.367906 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.367947 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:22 crc kubenswrapper[4819]: E0228 03:36:22.368506 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.367916 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:22 crc kubenswrapper[4819]: E0228 03:36:22.368968 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:22 crc kubenswrapper[4819]: E0228 03:36:22.369317 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.389123 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.414649 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6
87fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"re
ason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.432404 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f
195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.455204 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.474420 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.494119 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: E0228 03:36:22.502299 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.524726 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.542417 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.565186 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.585216 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.602216 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.621015 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.651517 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:13Z\\\",\\\"message\\\":\\\"OVN-Kubernetes controller took 0.104363729 seconds. 
No OVN measurement.\\\\nI0228 03:36:13.389115 6816 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0228 03:36:13.389160 6816 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0228 03:36:13.389167 6816 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0228 03:36:13.389210 6816 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF0228 03:36:13.389310 6816 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.666854 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:22 crc kubenswrapper[4819]: I0228 03:36:22.686557 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:22Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:23 crc kubenswrapper[4819]: I0228 03:36:23.368469 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:23 crc kubenswrapper[4819]: E0228 03:36:23.368661 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:24 crc kubenswrapper[4819]: I0228 03:36:24.368839 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:24 crc kubenswrapper[4819]: I0228 03:36:24.368925 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:24 crc kubenswrapper[4819]: E0228 03:36:24.368965 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:24 crc kubenswrapper[4819]: E0228 03:36:24.369103 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:24 crc kubenswrapper[4819]: I0228 03:36:24.369217 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:24 crc kubenswrapper[4819]: E0228 03:36:24.369346 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:25 crc kubenswrapper[4819]: I0228 03:36:25.368920 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:25 crc kubenswrapper[4819]: E0228 03:36:25.369124 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:26 crc kubenswrapper[4819]: I0228 03:36:26.368744 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:26 crc kubenswrapper[4819]: E0228 03:36:26.369605 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:26 crc kubenswrapper[4819]: I0228 03:36:26.368740 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:26 crc kubenswrapper[4819]: I0228 03:36:26.368829 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:26 crc kubenswrapper[4819]: E0228 03:36:26.369823 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:26 crc kubenswrapper[4819]: E0228 03:36:26.370125 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:27 crc kubenswrapper[4819]: I0228 03:36:27.334721 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:27 crc kubenswrapper[4819]: E0228 03:36:27.334890 4819 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:27 crc kubenswrapper[4819]: E0228 03:36:27.335015 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs podName:e7eede0c-6dc0-48ac-8065-7e0d9ed91212 nodeName:}" failed. No retries permitted until 2026-02-28 03:36:43.334986673 +0000 UTC m=+141.800555561 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs") pod "network-metrics-daemon-lbrtr" (UID: "e7eede0c-6dc0-48ac-8065-7e0d9ed91212") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:27 crc kubenswrapper[4819]: I0228 03:36:27.368189 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:27 crc kubenswrapper[4819]: E0228 03:36:27.368420 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:27 crc kubenswrapper[4819]: E0228 03:36:27.504562 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:36:28 crc kubenswrapper[4819]: I0228 03:36:28.369501 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:28 crc kubenswrapper[4819]: I0228 03:36:28.369621 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:28 crc kubenswrapper[4819]: E0228 03:36:28.369692 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:28 crc kubenswrapper[4819]: E0228 03:36:28.369797 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:28 crc kubenswrapper[4819]: I0228 03:36:28.369617 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:28 crc kubenswrapper[4819]: E0228 03:36:28.369908 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.055954 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.076917 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.095290 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc 
kubenswrapper[4819]: I0228 03:36:29.116411 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc 
kubenswrapper[4819]: I0228 03:36:29.135107 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.153763 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.168393 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.198424 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T0
3:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.209712 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.227163 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.259161 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:13Z\\\",\\\"message\\\":\\\"OVN-Kubernetes controller took 0.104363729 seconds. 
No OVN measurement.\\\\nI0228 03:36:13.389115 6816 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0228 03:36:13.389160 6816 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0228 03:36:13.389167 6816 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0228 03:36:13.389210 6816 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF0228 03:36:13.389310 6816 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.278690 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.295794 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.311404 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.333032 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.346431 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:29Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.368184 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:29 crc kubenswrapper[4819]: E0228 03:36:29.368753 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:29 crc kubenswrapper[4819]: I0228 03:36:29.369158 4819 scope.go:117] "RemoveContainer" containerID="89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.086208 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/1.log" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.094823 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e"} Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.095578 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:30 crc kubenswrapper[4819]: 
I0228 03:36:30.108163 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":
\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.122669 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.150951 4819 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:13Z\\\",\\\"message\\\":\\\"OVN-Kubernetes controller took 0.104363729 seconds. 
No OVN measurement.\\\\nI0228 03:36:13.389115 6816 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0228 03:36:13.389160 6816 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0228 03:36:13.389167 6816 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0228 03:36:13.389210 6816 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF0228 03:36:13.389310 6816 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.164509 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f
195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.181593 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.199785 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.215615 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.234627 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.247118 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.257904 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.270903 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.289571 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.306995 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.325237 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.340393 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:30Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.367844 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.367932 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:30 crc kubenswrapper[4819]: E0228 03:36:30.368000 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:30 crc kubenswrapper[4819]: E0228 03:36:30.368071 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:30 crc kubenswrapper[4819]: I0228 03:36:30.368085 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:30 crc kubenswrapper[4819]: E0228 03:36:30.368239 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.100898 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/2.log" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.102059 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/1.log" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.105601 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e" exitCode=1 Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.105650 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e"} Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.105715 4819 scope.go:117] "RemoveContainer" 
containerID="89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.106727 4819 scope.go:117] "RemoveContainer" containerID="f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e" Feb 28 03:36:31 crc kubenswrapper[4819]: E0228 03:36:31.106987 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.128661 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.148550 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.171685 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.193125 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.219808 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.242445 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.261954 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.278515 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc 
kubenswrapper[4819]: I0228 03:36:31.309601 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l
wbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.337025 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator 
for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.361330 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.368642 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:31 crc kubenswrapper[4819]: E0228 03:36:31.368798 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.378437 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.395952 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.417105 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89ba7fa4e637110467746d05426996b125c1acbb48e04327974555e0b0332b04\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:13Z\\\",\\\"message\\\":\\\"OVN-Kubernetes controller took 0.104363729 seconds. 
No OVN measurement.\\\\nI0228 03:36:13.389115 6816 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver/apiserver]} name:Service_openshift-kube-apiserver/apiserver_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.93:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d71b38eb-32af-4c0f-9490-7c317c111e3a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0228 03:36:13.389160 6816 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0228 03:36:13.389167 6816 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0228 03:36:13.389210 6816 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF0228 03:36:13.389310 6816 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:30Z\\\",\\\"message\\\":\\\"592 7022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.345636 7022 handler.go:190] Sending *v1.Namespace event handler 1 
for removal\\\\nI0228 03:36:30.345651 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:30.345687 7022 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0228 03:36:30.345704 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 03:36:30.345641 7022 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 03:36:30.345901 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 03:36:30.346057 7022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.346137 7022 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346571 7022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346837 7022 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.431571 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.494764 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.494785 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.494793 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.494804 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.494814 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:31Z","lastTransitionTime":"2026-02-28T03:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:31 crc kubenswrapper[4819]: E0228 03:36:31.510983 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.514883 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.514905 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.514912 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.514921 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.514928 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:31Z","lastTransitionTime":"2026-02-28T03:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:31 crc kubenswrapper[4819]: E0228 03:36:31.529388 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.534513 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.534564 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.534577 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.534598 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.534617 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:31Z","lastTransitionTime":"2026-02-28T03:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:31 crc kubenswrapper[4819]: E0228 03:36:31.551624 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.556979 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.557023 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.557035 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.557060 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.557074 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:31Z","lastTransitionTime":"2026-02-28T03:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:31 crc kubenswrapper[4819]: E0228 03:36:31.577634 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.582442 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.582495 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.582513 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.582543 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:31 crc kubenswrapper[4819]: I0228 03:36:31.582563 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:31Z","lastTransitionTime":"2026-02-28T03:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:31 crc kubenswrapper[4819]: E0228 03:36:31.599568 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:31Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:31 crc kubenswrapper[4819]: E0228 03:36:31.599809 4819 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.112902 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/2.log" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.118290 4819 scope.go:117] "RemoveContainer" containerID="f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e" Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.118435 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.136771 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f
195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.157741 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.177177 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.193864 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.216020 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.235682 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.256511 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.271364 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.284030 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.284146 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:04.284127071 +0000 UTC m=+162.749695929 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.284292 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.284348 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.284463 4819 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.284489 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.284510 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.284524 4819 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.284544 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:37:04.284523111 +0000 UTC m=+162.750092009 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.284568 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 03:37:04.284556342 +0000 UTC m=+162.750125240 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.284569 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.284753 4819 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.284899 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:37:04.284864679 +0000 UTC m=+162.750433587 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.292228 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86
ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.309328 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.324496 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.340466 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.355976 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.367899 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.367961 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.368009 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.368075 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.368205 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.368393 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.379204 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.385544 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.385755 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.385780 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.385813 4819 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.385881 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 03:37:04.385861166 +0000 UTC m=+162.851430064 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.409333 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:30Z\\\",\\\"message\\\":\\\"592 7022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.345636 7022 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:30.345651 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:30.345687 7022 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI0228 03:36:30.345704 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 03:36:30.345641 7022 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 03:36:30.345901 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 03:36:30.346057 7022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.346137 7022 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346571 7022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346837 7022 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.428599 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.444213 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.465420 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.480723 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.502167 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: E0228 03:36:32.505296 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.522018 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 
03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.537898 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.550994 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.565857 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36
:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.586618 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871
a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.606095 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.621094 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.638353 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.663809 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:30Z\\\",\\\"message\\\":\\\"592 7022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.345636 7022 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:30.345651 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:30.345687 7022 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI0228 03:36:30.345704 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 03:36:30.345641 7022 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 03:36:30.345901 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 03:36:30.346057 7022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.346137 7022 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346571 7022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346837 7022 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:32 crc kubenswrapper[4819]: I0228 03:36:32.679486 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:32Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:33 crc kubenswrapper[4819]: I0228 03:36:33.368483 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:33 crc kubenswrapper[4819]: E0228 03:36:33.368682 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:34 crc kubenswrapper[4819]: I0228 03:36:34.367981 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:34 crc kubenswrapper[4819]: I0228 03:36:34.368187 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:34 crc kubenswrapper[4819]: E0228 03:36:34.368607 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:34 crc kubenswrapper[4819]: E0228 03:36:34.368440 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:34 crc kubenswrapper[4819]: I0228 03:36:34.368299 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:34 crc kubenswrapper[4819]: E0228 03:36:34.368755 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:34 crc kubenswrapper[4819]: I0228 03:36:34.387423 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 28 03:36:35 crc kubenswrapper[4819]: I0228 03:36:35.368124 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:35 crc kubenswrapper[4819]: E0228 03:36:35.368298 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:36 crc kubenswrapper[4819]: I0228 03:36:36.368832 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:36 crc kubenswrapper[4819]: E0228 03:36:36.369468 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:36 crc kubenswrapper[4819]: I0228 03:36:36.368961 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:36 crc kubenswrapper[4819]: E0228 03:36:36.369714 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:36 crc kubenswrapper[4819]: I0228 03:36:36.368932 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:36 crc kubenswrapper[4819]: E0228 03:36:36.370007 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:37 crc kubenswrapper[4819]: I0228 03:36:37.367878 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:37 crc kubenswrapper[4819]: E0228 03:36:37.368086 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:37 crc kubenswrapper[4819]: E0228 03:36:37.506737 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:36:38 crc kubenswrapper[4819]: I0228 03:36:38.368497 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:38 crc kubenswrapper[4819]: I0228 03:36:38.368567 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:38 crc kubenswrapper[4819]: I0228 03:36:38.368522 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:38 crc kubenswrapper[4819]: E0228 03:36:38.368673 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:38 crc kubenswrapper[4819]: E0228 03:36:38.368761 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:38 crc kubenswrapper[4819]: E0228 03:36:38.368832 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:39 crc kubenswrapper[4819]: I0228 03:36:39.368500 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:39 crc kubenswrapper[4819]: E0228 03:36:39.369474 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:40 crc kubenswrapper[4819]: I0228 03:36:40.368714 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:40 crc kubenswrapper[4819]: I0228 03:36:40.368886 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:40 crc kubenswrapper[4819]: E0228 03:36:40.368972 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:40 crc kubenswrapper[4819]: I0228 03:36:40.369016 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:40 crc kubenswrapper[4819]: E0228 03:36:40.369224 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:40 crc kubenswrapper[4819]: E0228 03:36:40.369483 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.368854 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:41 crc kubenswrapper[4819]: E0228 03:36:41.369430 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.805145 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.805191 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.805203 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.805218 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.805230 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:41Z","lastTransitionTime":"2026-02-28T03:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:41 crc kubenswrapper[4819]: E0228 03:36:41.824368 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:41Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.829739 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.829797 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.829814 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.829837 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.829855 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:41Z","lastTransitionTime":"2026-02-28T03:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:41 crc kubenswrapper[4819]: E0228 03:36:41.849954 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:41Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.854906 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.854961 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.854978 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.855004 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.855021 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:41Z","lastTransitionTime":"2026-02-28T03:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:41 crc kubenswrapper[4819]: E0228 03:36:41.875050 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:41Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.880152 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.880210 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.880222 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.880255 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.880455 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:41Z","lastTransitionTime":"2026-02-28T03:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:41 crc kubenswrapper[4819]: E0228 03:36:41.898989 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:41Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.903963 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.904017 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.904036 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.904060 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:41 crc kubenswrapper[4819]: I0228 03:36:41.904079 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:41Z","lastTransitionTime":"2026-02-28T03:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:41 crc kubenswrapper[4819]: E0228 03:36:41.922996 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:41Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:41 crc kubenswrapper[4819]: E0228 03:36:41.923223 4819 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.368915 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.368981 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:42 crc kubenswrapper[4819]: E0228 03:36:42.369136 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.369234 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:42 crc kubenswrapper[4819]: E0228 03:36:42.369286 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:42 crc kubenswrapper[4819]: E0228 03:36:42.369474 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.386564 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.400013 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc 
kubenswrapper[4819]: I0228 03:36:42.415457 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc 
kubenswrapper[4819]: I0228 03:36:42.433483 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.446331 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.458453 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.478445 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T0
3:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.499396 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ec529d5-e315-455d-8481-a3f3a71cad3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79c09cb2dd74f17c57f56a55f863984633737cdae53673b22d0f70fb7d1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd18a54e72f50ca30875e73941b4b4a748949baf3e2e39fd74bac94a32054a70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfa00ea11af9a3f84da769b8eee5b2df34b57e08d669ec681f66632c2fb8db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: E0228 03:36:42.507357 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.514860 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.532726 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 
03:36:42.556684 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:30Z\\\",\\\"message\\\":\\\"592 7022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.345636 7022 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:30.345651 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:30.345687 7022 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI0228 03:36:30.345704 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 03:36:30.345641 7022 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 03:36:30.345901 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 03:36:30.346057 7022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.346137 7022 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346571 7022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346837 7022 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.576511 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.592420 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.610261 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.629466 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:42 crc kubenswrapper[4819]: I0228 03:36:42.646053 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:42Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:43 crc kubenswrapper[4819]: I0228 03:36:43.368185 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:43 crc kubenswrapper[4819]: E0228 03:36:43.368411 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:43 crc kubenswrapper[4819]: I0228 03:36:43.408539 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:43 crc kubenswrapper[4819]: E0228 03:36:43.408774 4819 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:43 crc kubenswrapper[4819]: E0228 03:36:43.408866 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs podName:e7eede0c-6dc0-48ac-8065-7e0d9ed91212 nodeName:}" failed. 
No retries permitted until 2026-02-28 03:37:15.408841108 +0000 UTC m=+173.874410006 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs") pod "network-metrics-daemon-lbrtr" (UID: "e7eede0c-6dc0-48ac-8065-7e0d9ed91212") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:36:44 crc kubenswrapper[4819]: I0228 03:36:44.368968 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:44 crc kubenswrapper[4819]: I0228 03:36:44.368985 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:44 crc kubenswrapper[4819]: I0228 03:36:44.369197 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:44 crc kubenswrapper[4819]: E0228 03:36:44.369191 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:44 crc kubenswrapper[4819]: E0228 03:36:44.369640 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:44 crc kubenswrapper[4819]: E0228 03:36:44.369771 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:44 crc kubenswrapper[4819]: I0228 03:36:44.369874 4819 scope.go:117] "RemoveContainer" containerID="f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e" Feb 28 03:36:44 crc kubenswrapper[4819]: E0228 03:36:44.370042 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" Feb 28 03:36:45 crc kubenswrapper[4819]: I0228 03:36:45.367904 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:45 crc kubenswrapper[4819]: E0228 03:36:45.368334 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:45 crc kubenswrapper[4819]: I0228 03:36:45.382670 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 28 03:36:46 crc kubenswrapper[4819]: I0228 03:36:46.368530 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:46 crc kubenswrapper[4819]: I0228 03:36:46.368530 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:46 crc kubenswrapper[4819]: I0228 03:36:46.368654 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:46 crc kubenswrapper[4819]: E0228 03:36:46.368804 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:46 crc kubenswrapper[4819]: E0228 03:36:46.369147 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:46 crc kubenswrapper[4819]: E0228 03:36:46.369436 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:47 crc kubenswrapper[4819]: I0228 03:36:47.368070 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:47 crc kubenswrapper[4819]: E0228 03:36:47.368287 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:47 crc kubenswrapper[4819]: E0228 03:36:47.508275 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:36:48 crc kubenswrapper[4819]: I0228 03:36:48.368640 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:48 crc kubenswrapper[4819]: I0228 03:36:48.368658 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:48 crc kubenswrapper[4819]: E0228 03:36:48.369304 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:48 crc kubenswrapper[4819]: I0228 03:36:48.368729 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:48 crc kubenswrapper[4819]: E0228 03:36:48.369600 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:48 crc kubenswrapper[4819]: E0228 03:36:48.369818 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.177146 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5ldpg_78f6484e-91d1-4345-baad-9f39f49a3915/kube-multus/0.log" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.177226 4819 generic.go:334] "Generic (PLEG): container finished" podID="78f6484e-91d1-4345-baad-9f39f49a3915" containerID="3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25" exitCode=1 Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.177297 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5ldpg" event={"ID":"78f6484e-91d1-4345-baad-9f39f49a3915","Type":"ContainerDied","Data":"3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25"} Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.177856 4819 scope.go:117] "RemoveContainer" containerID="3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.199562 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:48Z\\\",\\\"message\\\":\\\"2026-02-28T03:36:02+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b\\\\n2026-02-28T03:36:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b to /host/opt/cni/bin/\\\\n2026-02-28T03:36:03Z [verbose] multus-daemon started\\\\n2026-02-28T03:36:03Z [verbose] Readiness Indicator file check\\\\n2026-02-28T03:36:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.221330 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.238306 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc 
kubenswrapper[4819]: I0228 03:36:49.256543 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51
c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 
03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.277542 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.296326 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.315216 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.345911 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:30Z\\\",\\\"message\\\":\\\"592 7022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.345636 7022 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:30.345651 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:30.345687 7022 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI0228 03:36:30.345704 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 03:36:30.345641 7022 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 03:36:30.345901 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 03:36:30.346057 7022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.346137 7022 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346571 7022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346837 7022 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.364038 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ec529d5-e315-455d-8481-a3f3a71cad3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79c09cb2dd74f17c57f56a55f863984633737cdae53673b22d0f70fb7d1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd18a54e72f50ca30875e73941b4b4a748949baf3e2e39fd74bac94a32054a70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfa00ea11af9a3f84da769b8eee5b2df34b57e08d669ec681f66632c2fb8db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.367903 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:49 crc kubenswrapper[4819]: E0228 03:36:49.368125 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.380142 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.405631 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d
773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.423752 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.448411 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.466061 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f
195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.481832 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d901253d-ecd5-4e6a-9bfa-7669c576616c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079bd53ac5364aea9c07f32f8321259470df894518e6c95a0d5711fe6f36ce32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.502734 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:49 crc kubenswrapper[4819]: I0228 03:36:49.523473 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:49Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.185196 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5ldpg_78f6484e-91d1-4345-baad-9f39f49a3915/kube-multus/0.log" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.185306 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5ldpg" event={"ID":"78f6484e-91d1-4345-baad-9f39f49a3915","Type":"ContainerStarted","Data":"5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1"} Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.208006 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871
a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.226946 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.243575 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.288115 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.306852 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ec529d5-e315-455d-8481-a3f3a71cad3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79c09cb2dd74f17c57f56a55f863984633737cdae53673b22d0f70fb7d1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd18a54e72f50ca30875e73941b4b4a748949baf3e2e39fd74bac94a32054a70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfa00ea11af9a3f84da769b8eee5b2df34b57e08d669ec681f66632c2fb8db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.325243 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.348540 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.367939 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.368053 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:50 crc kubenswrapper[4819]: E0228 03:36:50.368218 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.368495 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:50 crc kubenswrapper[4819]: E0228 03:36:50.368730 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:50 crc kubenswrapper[4819]: E0228 03:36:50.368834 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.380520 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:30Z\\\",\\\"message\\\":\\\"592 7022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.345636 7022 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:30.345651 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:30.345687 7022 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI0228 03:36:30.345704 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 03:36:30.345641 7022 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 03:36:30.345901 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 03:36:30.346057 7022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.346137 7022 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346571 7022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346837 7022 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.399730 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d901253d-ecd5-4e6a-9bfa-7669c576616c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079bd53ac5364aea9c07f32f8321259470df894518e6c95a0d5711fe6f36ce32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.423689 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.445209 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.462327 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.484275 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.501290 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.521243 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:48Z\\\",\\\"message\\\":\\\"2026-02-28T03:36:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b\\\\n2026-02-28T03:36:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b to /host/opt/cni/bin/\\\\n2026-02-28T03:36:03Z [verbose] multus-daemon started\\\\n2026-02-28T03:36:03Z [verbose] Readiness Indicator file check\\\\n2026-02-28T03:36:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.535330 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:50 crc kubenswrapper[4819]: I0228 03:36:50.548079 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:50Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:51 crc kubenswrapper[4819]: I0228 03:36:51.368377 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:51 crc kubenswrapper[4819]: E0228 03:36:51.368577 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.304021 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.304084 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.304101 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.304128 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.304147 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:52Z","lastTransitionTime":"2026-02-28T03:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:52 crc kubenswrapper[4819]: E0228 03:36:52.328780 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.333928 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.333997 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.334016 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.334041 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.334060 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:52Z","lastTransitionTime":"2026-02-28T03:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:52 crc kubenswrapper[4819]: E0228 03:36:52.353700 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.358618 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.358685 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.358706 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.358732 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.358751 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:52Z","lastTransitionTime":"2026-02-28T03:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.368629 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:52 crc kubenswrapper[4819]: E0228 03:36:52.368829 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.369809 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:52 crc kubenswrapper[4819]: E0228 03:36:52.369921 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.370223 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:52 crc kubenswrapper[4819]: E0228 03:36:52.370353 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:52 crc kubenswrapper[4819]: E0228 03:36:52.381816 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.382109 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.387032 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.387093 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.387118 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.387144 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.387167 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:52Z","lastTransitionTime":"2026-02-28T03:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.393075 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: E0228 03:36:52.408189 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.414172 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.414229 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.414282 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.414315 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.414337 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:36:52Z","lastTransitionTime":"2026-02-28T03:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.414972 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.433033 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: E0228 03:36:52.437326 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: E0228 03:36:52.437565 4819 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.449453 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd1
5c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.466940 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ec529d5-e315-455d-8481-a3f3a71cad3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79c09cb2dd74f17c57f56a55f863984633737cdae53673b22d0f70fb7d1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd18a54e72f50ca30875e73941b4b4a748949baf3e2e39fd74bac94a32054a70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfa00ea11af9a3f84da769b8eee5b2df34b57e08d669ec681f66632c2fb8db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.482203 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468
fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.502054 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: E0228 03:36:52.508927 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.536813 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:30Z\\\",\\\"message\\\":\\\"592 7022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.345636 7022 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:30.345651 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:30.345687 7022 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI0228 03:36:30.345704 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 03:36:30.345641 7022 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 03:36:30.345901 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 03:36:30.346057 7022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.346137 7022 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346571 7022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346837 7022 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.556615 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e
878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.571348 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.584307 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d901253d-ecd5-4e6a-9bfa-7669c576616c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079bd53ac5364aea9c07f32f8321259470df894518e6c95a0d5711fe6f36ce32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.605996 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.621632 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.639859 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.662620 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:48Z\\\",\\\"message\\\":\\\"2026-02-28T03:36:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b\\\\n2026-02-28T03:36:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b to /host/opt/cni/bin/\\\\n2026-02-28T03:36:03Z [verbose] multus-daemon started\\\\n2026-02-28T03:36:03Z [verbose] Readiness Indicator file check\\\\n2026-02-28T03:36:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.686526 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:52 crc kubenswrapper[4819]: I0228 03:36:52.706441 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:52Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:53 crc 
kubenswrapper[4819]: I0228 03:36:53.368750 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:53 crc kubenswrapper[4819]: E0228 03:36:53.369043 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:54 crc kubenswrapper[4819]: I0228 03:36:54.368507 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:54 crc kubenswrapper[4819]: I0228 03:36:54.368647 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:54 crc kubenswrapper[4819]: E0228 03:36:54.368682 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:54 crc kubenswrapper[4819]: E0228 03:36:54.368897 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:54 crc kubenswrapper[4819]: I0228 03:36:54.369040 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:54 crc kubenswrapper[4819]: E0228 03:36:54.369225 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:55 crc kubenswrapper[4819]: I0228 03:36:55.368198 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:55 crc kubenswrapper[4819]: E0228 03:36:55.368432 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:56 crc kubenswrapper[4819]: I0228 03:36:56.368114 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:56 crc kubenswrapper[4819]: I0228 03:36:56.368119 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:56 crc kubenswrapper[4819]: I0228 03:36:56.369415 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:56 crc kubenswrapper[4819]: E0228 03:36:56.369671 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:56 crc kubenswrapper[4819]: E0228 03:36:56.369825 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:56 crc kubenswrapper[4819]: E0228 03:36:56.369947 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:56 crc kubenswrapper[4819]: I0228 03:36:56.393946 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 28 03:36:57 crc kubenswrapper[4819]: I0228 03:36:57.606168 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:57 crc kubenswrapper[4819]: I0228 03:36:57.606276 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:57 crc kubenswrapper[4819]: E0228 03:36:57.606430 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:57 crc kubenswrapper[4819]: E0228 03:36:57.606756 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:57 crc kubenswrapper[4819]: E0228 03:36:57.607597 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:36:58 crc kubenswrapper[4819]: I0228 03:36:58.368063 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:36:58 crc kubenswrapper[4819]: I0228 03:36:58.368138 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:36:58 crc kubenswrapper[4819]: E0228 03:36:58.368726 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:36:58 crc kubenswrapper[4819]: E0228 03:36:58.368858 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:36:58 crc kubenswrapper[4819]: I0228 03:36:58.369125 4819 scope.go:117] "RemoveContainer" containerID="f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.221230 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/2.log" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.225460 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"} Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.226080 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:36:59 crc kubenswrapper[4819]: 
I0228 03:36:59.242778 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0844b6c-2e32-49c9-bffd-d01251524de5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd59238a97546f03c63a8a8c6ad93f642502a72bb7ce6282447988051d68581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11d618fcb5b6c1ff0dbd86f992db3aaef7fbbf5b535e4a52eddb718615156150\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0228 03:34:50.275504 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 03:34:50.276994 1 observer_polling.go:159] Starting file observer\\\\nI0228 03:34:50.279653 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 03:34:50.280626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 03:35:14.485093 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0228 03:35:19.691731 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 03:35:19.691872 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:50Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:35:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67db3bae0b60db8f41b5448a1b29d377d320b668f3aaebfcaec593d99c8849e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4333ea161996cd65ab3027eafa25efe039a1eaf4eae370bd93c5781ae44c00f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edad6d5aefa2b4525c99955b0f46074375ef146eac74482374c6563e137d2a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.256794 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ec529d5-e315-455d-8481-a3f3a71cad3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79c09cb2dd74f17c57f56a55f863984633737cdae53673b22d0f70fb7d1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd18a54e72f50ca30875e73941b4b4a748949baf3e2e39fd74bac94a32054a70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfa00ea11af9a3f84da769b8eee5b2df34b57e08d669ec681f66632c2fb8db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.268062 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468
fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.282751 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.307011 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:30Z\\\",\\\"message\\\":\\\"592 7022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.345636 7022 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:30.345651 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:30.345687 7022 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI0228 03:36:30.345704 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 03:36:30.345641 7022 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 03:36:30.345901 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 03:36:30.346057 7022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.346137 7022 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346571 7022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346837 7022 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.331564 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03e11a0d-8058-4d8c-b781-bf2465061d11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f31b1e80c49f686cd4ba79a59efefe67ccb8b7ca7055a19bcf7338cfc97804bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f57615eaa4744ed0df55d7c57709e96489e705b02cb6f0984eddf5999ffc730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf17e6bdcd3888d2a1bf313429c7fa99831233a4f38b71729844ec37dcb27c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa573d495e9a6c91589189b25a721deb399cc0cc50bd6581fa578607a7c93b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://931a5d438456f1c0cf96c2c26f70a55d60eb4557831639d66b47f068bdb2be31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.349483 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d901253d-ecd5-4e6a-9bfa-7669c576616c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079bd53ac5364aea9c07f32f8321259470df894518e6c95a0d5711fe6f36ce32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.365906 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.368239 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.368323 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:36:59 crc kubenswrapper[4819]: E0228 03:36:59.368454 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:36:59 crc kubenswrapper[4819]: E0228 03:36:59.368624 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.383454 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.398629 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.418589 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.434820 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.452996 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:48Z\\\",\\\"message\\\":\\\"2026-02-28T03:36:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b\\\\n2026-02-28T03:36:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b to /host/opt/cni/bin/\\\\n2026-02-28T03:36:03Z [verbose] multus-daemon started\\\\n2026-02-28T03:36:03Z [verbose] Readiness Indicator file check\\\\n2026-02-28T03:36:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.474591 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.494235 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.512863 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z"
Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.530090 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z"
Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.547383 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z"
Feb 28 03:36:59 crc kubenswrapper[4819]: I0228 03:36:59.561414 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:36:59Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.232086 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/3.log" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.233146 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/2.log" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.237392 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65" exitCode=1 Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.237448 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"} Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.237568 4819 scope.go:117] "RemoveContainer" containerID="f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.238526 4819 scope.go:117] "RemoveContainer" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65" Feb 28 03:37:00 crc kubenswrapper[4819]: E0228 03:37:00.238791 
4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8"
Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.259687 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z"
Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.277955 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z"
Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.300805 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.318181 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.350757 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03e11a0d-8058-4d8c-b781-bf2465061d11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f31b1e80c49f686cd4ba79a59efefe67ccb8b7ca7055a19bcf7338cfc97804bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f57615eaa4744ed0df55d7c57709e96489e705b02cb6f0984eddf5999ffc730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf17e6bdcd3888d2a1bf313429c7fa99831233a4f38b71729844ec37dcb27c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa573d495e9a6c91589189b25a721deb399cc0cc50bd6581fa578607a7c93b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://931a5d438456f1c0cf96c2c26f70a55d60eb4557831639d66b47f068bdb2be31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.366175 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d901253d-ecd5-4e6a-9bfa-7669c576616c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079bd53ac5364aea9c07f32f8321259470df894518e6c95a0d5711fe6f36ce32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.368457 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.368487 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:00 crc kubenswrapper[4819]: E0228 03:37:00.368617 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:00 crc kubenswrapper[4819]: E0228 03:37:00.368706 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.386092 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.406918 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:48Z\\\",\\\"message\\\":\\\"2026-02-28T03:36:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b\\\\n2026-02-28T03:36:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b to /host/opt/cni/bin/\\\\n2026-02-28T03:36:03Z [verbose] multus-daemon started\\\\n2026-02-28T03:36:03Z [verbose] 
Readiness Indicator file check\\\\n2026-02-28T03:36:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.424662 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.445690 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc 
kubenswrapper[4819]: I0228 03:37:00.465333 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l
wbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.488211 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator 
for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.507381 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.524097 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.543761 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.575329 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f729af9f79b51fdceba8392f8417070eca551fc1cdc41c6d4614ee7e47a3627e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:30Z\\\",\\\"message\\\":\\\"592 7022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.345636 7022 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0228 03:36:30.345651 7022 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0228 03:36:30.345687 7022 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI0228 03:36:30.345704 7022 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0228 03:36:30.345641 7022 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0228 03:36:30.345901 7022 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0228 03:36:30.346057 7022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0228 03:36:30.346137 7022 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346571 7022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0228 03:36:30.346837 7022 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:59Z\\\",\\\"message\\\":\\\"ble:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0228 03:36:59.485847 7356 obj_retry.go:551] Creating *factory.egressNode crc took: 3.249921ms\\\\nI0228 03:36:59.485928 7356 factory.go:1336] Added *v1.Node event handler 7\\\\nI0228 
03:36:59.486031 7356 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0228 03:36:59.486049 7356 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 03:36:59.486117 7356 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 03:36:59.486192 7356 factory.go:656] Stopping watch factory\\\\nI0228 03:36:59.486232 7356 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 03:36:59.486282 7356 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 03:36:59.486598 7356 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0228 03:36:59.486741 7356 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0228 03:36:59.486838 7356 ovnkube.go:599] Stopped ovnkube\\\\nI0228 03:36:59.486936 7356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 03:36:59.487074 7356 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\
\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.594497 4819 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0844b6c-2e32-49c9-bffd-d01251524de5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd59238a97546f03c63a8a8c6ad93f642502a72bb7ce6282447988051d68581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11d618fcb5b6c1ff0dbd86f992db3aaef7fbbf5b535e4a52eddb718615156150\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0228 03:34:50.275504 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 03:34:50.276994 1 observer_polling.go:159] Starting file observer\\\\nI0228 03:34:50.279653 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 03:34:50.280626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 03:35:14.485093 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0228 03:35:19.691731 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 03:35:19.691872 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:50Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:35:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67db3bae0b60db8f41b5448a1b29d377d320b668f3aaebfcaec593d99c8849e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4333ea161996cd65ab3027eafa25efe039a1eaf4eae370bd93c5781ae44c00f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edad6d5aefa2b4525c99955b0f46074375ef146eac74482374c6563e137d2a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.611708 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ec529d5-e315-455d-8481-a3f3a71cad3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79c09cb2dd74f17c57f56a55f863984633737cdae53673b22d0f70fb7d1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd18a54e72f50ca30875e73941b4b4a748949baf3e2e39fd74bac94a32054a70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfa00ea11af9a3f84da769b8eee5b2df34b57e08d669ec681f66632c2fb8db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:00 crc kubenswrapper[4819]: I0228 03:37:00.627411 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468
fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:00Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.242857 4819 scope.go:117] "RemoveContainer" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65" Feb 28 03:37:01 crc kubenswrapper[4819]: E0228 03:37:01.243214 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\"" 
pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.263587 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.282452 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.303591 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28
T03:36:48Z\\\",\\\"message\\\":\\\"2026-02-28T03:36:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b\\\\n2026-02-28T03:36:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b to /host/opt/cni/bin/\\\\n2026-02-28T03:36:03Z [verbose] multus-daemon started\\\\n2026-02-28T03:36:03Z [verbose] Readiness Indicator file check\\\\n2026-02-28T03:36:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\
\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.323935 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.340830 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.357043 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.368570 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.368596 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:01 crc kubenswrapper[4819]: E0228 03:37:01.368804 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:01 crc kubenswrapper[4819]: E0228 03:37:01.368975 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.379156 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 
03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.397513 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ec529d5-e315-455d-8481-a3f3a71cad3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79c09cb2dd74f17c57f56a55f863984633737cdae53673b22d0f70fb7d1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd18a54e72f50ca30875e73941b4b4a748949baf3e2e39fd74bac94a32054a70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfa00ea11af9a3f84da769b8eee5b2df34b57e08d669ec681f66632c2fb8db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.413806 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468
fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.432808 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.465611 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:59Z\\\",\\\"message\\\":\\\"ble:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0228 03:36:59.485847 7356 obj_retry.go:551] Creating *factory.egressNode crc took: 3.249921ms\\\\nI0228 03:36:59.485928 7356 factory.go:1336] Added *v1.Node event handler 7\\\\nI0228 03:36:59.486031 7356 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0228 03:36:59.486049 7356 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 03:36:59.486117 7356 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 03:36:59.486192 7356 factory.go:656] Stopping watch factory\\\\nI0228 03:36:59.486232 7356 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 03:36:59.486282 7356 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 03:36:59.486598 7356 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0228 03:36:59.486741 7356 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0228 03:36:59.486838 7356 ovnkube.go:599] Stopped ovnkube\\\\nI0228 03:36:59.486936 7356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 03:36:59.487074 7356 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.486947 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0844b6c-2e32-49c9-bffd-d01251524de5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd59238a97546f03c63a8a8c6ad93f642502a72bb7ce6282447988051d68581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11d618fcb5b6c1ff0dbd86f992db3aaef7fbbf5b535e4a52eddb718615156150\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0228 03:34:50.275504 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 03:34:50.276994 1 observer_polling.go:159] Starting file observer\\\\nI0228 03:34:50.279653 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 03:34:50.280626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 03:35:14.485093 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0228 03:35:19.691731 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 03:35:19.691872 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:50Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:35:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67db3bae0b60db8f41b5448a1b29d377d320b668f3aaebfcaec593d99c8849e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4333ea161996cd65ab3027eafa25efe039a1eaf4eae370bd93c5781ae44c00f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edad6d5aefa2b4525c99955b0f46074375ef146eac74482374c6563e137d2a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.503752 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d901253d-ecd5-4e6a-9bfa-7669c576616c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079bd53ac5364aea9c07f32f8321259470df894518e6c95a0d5711fe6f36ce32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.525323 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.541057 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.560447 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.584441 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.601730 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:01 crc kubenswrapper[4819]: I0228 03:37:01.633068 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03e11a0d-8058-4d8c-b781-bf2465061d11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f31b1e80c49f686cd4ba79a59efefe67ccb8b7ca7055a19bcf7338cfc97804bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f57615eaa4744ed0df55d7c57709e96489e705b02cb6f0984eddf5999ffc730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf17e6bdcd3888d2a1bf313429c7fa99831233a4f38b71729844ec37dcb27c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa573d495e9a6c91589189b25a721deb399cc0cc50bd6581fa578607a7c93b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://931a5d438456f1c0cf96c2c26f70a55d60eb4557831639d66b47f068bdb2be31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:01Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.246861 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/3.log" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.368474 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.368563 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 03:37:02 crc kubenswrapper[4819]: E0228 03:37:02.368661 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 28 03:37:02 crc kubenswrapper[4819]: E0228 03:37:02.368752 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.390411 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0844b6c-2e32-49c9-bffd-d01251524de5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd59238a97546f03c63a8a8c6ad93f642502a72bb7ce6282447988051d68581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11d618fcb5b6c1ff0dbd86f992db3aaef7fbbf5b535e4a52eddb718615156150\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0228 03:34:50.275504 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 03:34:50.276994 1 observer_polling.go:159] Starting file observer\\\\nI0228 03:34:50.279653 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 03:34:50.280626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 03:35:14.485093 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0228 03:35:19.691731 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 03:35:19.691872 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:50Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:35:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67db3bae0b60db8f41b5448a1b29d377d320b668f3aaebfcaec593d99c8849e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4333ea161996cd65ab3027eafa25efe039a1eaf4eae370bd93c5781ae44c00f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edad6d5aefa2b4525c99955b0f46074375ef146eac74482374c6563e137d2a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.408951 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ec529d5-e315-455d-8481-a3f3a71cad3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79c09cb2dd74f17c57f56a55f863984633737cdae53673b22d0f70fb7d1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd18a54e72f50ca30875e73941b4b4a748949baf3e2e39fd74bac94a32054a70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfa00ea11af9a3f84da769b8eee5b2df34b57e08d669ec681f66632c2fb8db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.424609 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468
fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.443987 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.482332 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:59Z\\\",\\\"message\\\":\\\"ble:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0228 03:36:59.485847 7356 obj_retry.go:551] Creating *factory.egressNode crc took: 3.249921ms\\\\nI0228 03:36:59.485928 7356 factory.go:1336] Added *v1.Node event handler 7\\\\nI0228 03:36:59.486031 7356 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0228 03:36:59.486049 7356 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 03:36:59.486117 7356 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 03:36:59.486192 7356 factory.go:656] Stopping watch factory\\\\nI0228 03:36:59.486232 7356 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 03:36:59.486282 7356 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 03:36:59.486598 7356 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0228 03:36:59.486741 7356 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0228 03:36:59.486838 7356 ovnkube.go:599] Stopped ovnkube\\\\nI0228 03:36:59.486936 7356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 03:36:59.487074 7356 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.483573 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.483641 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.483658 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.483684 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.483702 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:37:02Z","lastTransitionTime":"2026-02-28T03:37:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.507370 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: E0228 03:37:02.511457 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.516234 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.516330 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.516349 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.516374 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.516391 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:37:02Z","lastTransitionTime":"2026-02-28T03:37:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:37:02 crc kubenswrapper[4819]: E0228 03:37:02.536791 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.539586 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03e11a0d-8058-4d8c-b781-bf2465061d11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f31b1e80c49f686cd4ba79a59efefe67ccb8b7ca7055a19bcf7338cfc97804bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f57615eaa4744ed0df55d7c57709e96489e705b02cb6f0984eddf5999ffc730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf17e6bdcd3888d2a1bf313429c7fa99831233a4f38b71729844ec37dcb27c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://aa573d495e9a6c91589189b25a721deb399cc0cc50bd6581fa578607a7c93b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://931a5d438456f1c0cf96c2c26f70a55d60eb4557831639d66b47f068bdb2be31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.542155 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.542330 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.542370 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.542404 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.542430 4819 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:37:02Z","lastTransitionTime":"2026-02-28T03:37:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.558420 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d901253d-ecd5-4e6a-9bfa-7669c576616c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079bd53ac5364aea9c07f32f8321259470df894518e6c95a0d5711fe6f36ce32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: E0228 03:37:02.563673 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.567943 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.568003 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.568027 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.568055 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.568081 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:37:02Z","lastTransitionTime":"2026-02-28T03:37:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.579511 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: E0228 03:37:02.587532 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.592150 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.592210 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.592228 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.592280 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.592300 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:37:02Z","lastTransitionTime":"2026-02-28T03:37:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.598311 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: E0228 03:37:02.608542 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 28 03:37:02 crc kubenswrapper[4819]: E0228 03:37:02.614786 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: E0228 03:37:02.615001 4819 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.619023 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.638922 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e
878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.651220 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-02-28T03:36:48Z\\\",\\\"message\\\":\\\"2026-02-28T03:36:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b\\\\n2026-02-28T03:36:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b to /host/opt/cni/bin/\\\\n2026-02-28T03:36:03Z [verbose] multus-daemon started\\\\n2026-02-28T03:36:03Z [verbose] Readiness Indicator file check\\\\n2026-02-28T03:36:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.666757 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.683748 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc 
kubenswrapper[4819]: I0228 03:37:02.704916 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51
c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 
03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.722525 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.739291 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:02 crc kubenswrapper[4819]: I0228 03:37:02.753567 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:02Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:03 crc kubenswrapper[4819]: I0228 03:37:03.368386 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:03 crc kubenswrapper[4819]: I0228 03:37:03.368398 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:03 crc kubenswrapper[4819]: E0228 03:37:03.368603 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:03 crc kubenswrapper[4819]: E0228 03:37:03.368759 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:04 crc kubenswrapper[4819]: I0228 03:37:04.291185 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.291413 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.291370299 +0000 UTC m=+226.756939197 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:04 crc kubenswrapper[4819]: I0228 03:37:04.291549 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.291717 4819 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:37:04 crc kubenswrapper[4819]: I0228 03:37:04.291784 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.291798 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.291774859 +0000 UTC m=+226.757343747 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 28 03:37:04 crc kubenswrapper[4819]: I0228 03:37:04.291865 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.292075 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.292101 4819 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.292113 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.292140 4819 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.292208 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.292182999 +0000 UTC m=+226.757751897 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.292243 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.2922274 +0000 UTC m=+226.757796378 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:37:04 crc kubenswrapper[4819]: I0228 03:37:04.368628 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:04 crc kubenswrapper[4819]: I0228 03:37:04.368695 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.368866 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.369002 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:04 crc kubenswrapper[4819]: I0228 03:37:04.392535 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.392770 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.392803 4819 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.392824 4819 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:37:04 crc kubenswrapper[4819]: E0228 03:37:04.392901 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.392879311 +0000 UTC m=+226.858448209 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 28 03:37:05 crc kubenswrapper[4819]: I0228 03:37:05.368923 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:05 crc kubenswrapper[4819]: I0228 03:37:05.368944 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:05 crc kubenswrapper[4819]: E0228 03:37:05.369532 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:05 crc kubenswrapper[4819]: E0228 03:37:05.369817 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:06 crc kubenswrapper[4819]: I0228 03:37:06.368700 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:06 crc kubenswrapper[4819]: I0228 03:37:06.368771 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:06 crc kubenswrapper[4819]: E0228 03:37:06.368920 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:06 crc kubenswrapper[4819]: E0228 03:37:06.369033 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:07 crc kubenswrapper[4819]: I0228 03:37:07.368456 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:07 crc kubenswrapper[4819]: I0228 03:37:07.368506 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:07 crc kubenswrapper[4819]: E0228 03:37:07.368695 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:07 crc kubenswrapper[4819]: E0228 03:37:07.368838 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:07 crc kubenswrapper[4819]: E0228 03:37:07.610117 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:37:08 crc kubenswrapper[4819]: I0228 03:37:08.368543 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:08 crc kubenswrapper[4819]: I0228 03:37:08.368663 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:08 crc kubenswrapper[4819]: E0228 03:37:08.368751 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:08 crc kubenswrapper[4819]: E0228 03:37:08.368896 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:09 crc kubenswrapper[4819]: I0228 03:37:09.368156 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:09 crc kubenswrapper[4819]: I0228 03:37:09.368172 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:09 crc kubenswrapper[4819]: E0228 03:37:09.368390 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:09 crc kubenswrapper[4819]: E0228 03:37:09.368557 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:10 crc kubenswrapper[4819]: I0228 03:37:10.368349 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:10 crc kubenswrapper[4819]: I0228 03:37:10.368424 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:10 crc kubenswrapper[4819]: E0228 03:37:10.368583 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:10 crc kubenswrapper[4819]: E0228 03:37:10.368678 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:11 crc kubenswrapper[4819]: I0228 03:37:11.368784 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:11 crc kubenswrapper[4819]: I0228 03:37:11.368855 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:11 crc kubenswrapper[4819]: E0228 03:37:11.368991 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:11 crc kubenswrapper[4819]: E0228 03:37:11.369123 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.369019 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.369158 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:12 crc kubenswrapper[4819]: E0228 03:37:12.369236 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:12 crc kubenswrapper[4819]: E0228 03:37:12.369432 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.386050 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82ldl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:11Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lbrtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc 
kubenswrapper[4819]: I0228 03:37:12.405993 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5ldpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78f6484e-91d1-4345-baad-9f39f49a3915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:48Z\\\",\\\"message\\\":\\\"2026-02-28T03:36:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b\\\\n2026-02-28T03:36:02+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_316b9a18-55d0-4515-aef0-29fb7347fb0b to /host/opt/cni/bin/\\\\n2026-02-28T03:36:03Z [verbose] multus-daemon started\\\\n2026-02-28T03:36:03Z [verbose] Readiness Indicator file check\\\\n2026-02-28T03:36:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v76t9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5ldpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.425070 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.441797 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6ad11c1-0eb7-4064-bb39-3ffb389efb90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edf9dfc032ef80851e9959ffbf1dca8c7dd89ea729cff6896690a222f67f93eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774
d269ab9fd73a80f49a5762af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88mgf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rw4hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.458061 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q5btw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f29dfe8-c6ab-429e-8ed5-3ca9be724486\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dece7685a60f62807fdae3d4b4ef7600dbf4051bd31f7c66006defa5c8f2f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwbsx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q5btw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.478261 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"036831af-028e-4d9b-913c-15cb3632eb9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0228 03:35:25.044439 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0228 03:35:25.044553 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0228 03:35:25.045220 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3545176149/tls.crt::/tmp/serving-cert-3545176149/tls.key\\\\\\\"\\\\nI0228 03:35:25.478892 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0228 03:35:25.484295 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0228 03:35:25.484324 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0228 03:35:25.484344 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0228 03:35:25.484348 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0228 03:35:25.491599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0228 03:35:25.491636 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491641 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0228 03:35:25.491645 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0228 03:35:25.491649 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0228 03:35:25.491652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0228 03:35:25.491655 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0228 03:35:25.491853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0228 03:35:25.493650 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:35:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T0
3:34:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.497976 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.517088 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-krp5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebdb39d5-8593-4a70-a0cd-c4701f9e58da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa473468fc442b84fe3ae6293d140819b0b6b3828c400aa1b74a4feb0e4ad5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgl7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-krp5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.538624 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fe18a003be1bb5fe7ee93b6781faee1633445efa3269328a4120af1bce118a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://152e9c078e8f6766f8039941b0a2fe6716add5a4d4a07a955c96b54602c5ee56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.571991 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caffcb28-383d-4424-a641-7dd1f36080c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-28T03:36:59Z\\\",\\\"message\\\":\\\"ble:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0228 03:36:59.485847 7356 obj_retry.go:551] Creating *factory.egressNode crc took: 3.249921ms\\\\nI0228 03:36:59.485928 7356 factory.go:1336] Added *v1.Node event handler 7\\\\nI0228 03:36:59.486031 7356 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0228 03:36:59.486049 7356 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0228 03:36:59.486117 7356 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0228 03:36:59.486192 7356 factory.go:656] Stopping watch factory\\\\nI0228 03:36:59.486232 7356 handler.go:208] Removed *v1.Node event handler 7\\\\nI0228 03:36:59.486282 7356 handler.go:208] Removed *v1.Node event handler 2\\\\nI0228 03:36:59.486598 7356 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0228 03:36:59.486741 7356 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0228 03:36:59.486838 7356 ovnkube.go:599] Stopped ovnkube\\\\nI0228 03:36:59.486936 7356 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0228 03:36:59.487074 7356 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52ad70b3667fc8a3c1
8d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9lwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njv8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.592753 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0844b6c-2e32-49c9-bffd-d01251524de5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd59238a97546f03c63a8a8c6ad93f642502a72bb7ce6282447988051d68581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11d618fcb5b6c1ff0dbd86f992db3aaef7fbbf5b535e4a52eddb718615156150\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-28T03:35:20Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0228 03:34:50.275504 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0228 03:34:50.276994 1 observer_polling.go:159] Starting file observer\\\\nI0228 03:34:50.279653 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0228 03:34:50.280626 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0228 03:35:14.485093 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0228 03:35:19.691731 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0228 03:35:19.691872 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:50Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:35:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67db3bae0b60db8f41b5448a1b29d377d320b668f3aaebfcaec593d99c8849e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4333ea161996cd65ab3027eafa25efe039a1eaf4eae370bd93c5781ae44c00f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4edad6d5aefa2b4525c99955b0f46074375ef146eac74482374c6563e137d2a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: E0228 03:37:12.611370 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.611597 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ec529d5-e315-455d-8481-a3f3a71cad3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:35:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e79c09cb2dd74f17c57f56a55f863984633737cdae53673b22d0f70fb7d1495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd18a54e72f50ca30875e73941b4b4a
748949baf3e2e39fd74bac94a32054a70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfa00ea11af9a3f84da769b8eee5b2df34b57e08d669ec681f66632c2fb8db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74932b671b92cffe088a6af1dc91f839914151644995787d0cc78d0ac37dcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.634462 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d9d13538e5326889d059bb861df676e49c858e3b4c7bef15af42225133a291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.654568 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.671789 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb78c220fbb9c383d8159fee34d35e3cc22dc618065aff540dc673143d58fb7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.693788 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47759ba-9f0e-4aba-b3cf-dc4142c02f41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19c0c2bbd6d6c4b26013b44500f49d01acbccbf96c3b30f49d56e6a86cdcc557\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e27a2ac0268a7a209cc3797c19334f80048472da649130bd70275acf0a320a3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2e7a1ec756144bdefa8424e22eca8e92e5adf38292f4bf068ed4139bc606fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd5a3936e8211ac0dd933edbd99e54ad1c8228f727b7e08878c13f6a9c50a92c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cc3e878d9f59395df08ef93394ce4aff09e6cfbc1eddf0d657ffa4526b6f47c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bf1d21c1079ee69a212b03feb0e2f1991ee712fb703356b3b12bd1cd5111a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://180836ac698d132d555db191aa022e7212d90c6a26a7ffcc73234fed4daa0926\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:36:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbjp4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8c5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.710159 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad04ba9d-eb5c-422e-bf1b-8f6dee9399d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:36:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd04b6ff98e7249d065713e8f98aac996a8cdbf52754b022df6f41638a8698e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5550bc658d20373ada936202e1b2942feba7f195baa5c31cf0c238fd43c4bb31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:36:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjjmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:36:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8nk8p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.741737 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03e11a0d-8058-4d8c-b781-bf2465061d11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f31b1e80c49f686cd4ba79a59efefe67ccb8b7ca7055a19bcf7338cfc97804bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f57615eaa4744ed0df55d7c57709e96489e705b02cb6f0984eddf5999ffc730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf17e6bdcd3888d2a1bf313429c7fa99831233a4f38b71729844ec37dcb27c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa573d495e9a6c91589189b25a721deb399cc0cc50bd6581fa578607a7c93b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://931a5d438456f1c0cf96c2c26f70a55d60eb4557831639d66b47f068bdb2be31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188
d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64625d2ad1d8ac46b64c62293975a67cb3a7173d8ce28b44bd3022321e69188d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec880415b4d5dda85bb0387153c86555aaee9b0d1ff3db2842e2037dc372a0ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b41013dae8088f9ad14b99a6c94c3fd35c4d985ef10b874a2d78fcf30a16e05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.756209 4819 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d901253d-ecd5-4e6a-9bfa-7669c576616c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-28T03:34:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://079bd53ac5364aea9c07f32f8321259470df894518e6c95a0d5711fe6f36ce32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-28T03:34:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b17a878f9dfd20eb895cf423dfe25ef545c9cf0bd85c1c0bf753465a75a89ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-28T03:34:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-28T03:34:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-28T03:34:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.883305 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.883360 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.883377 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.883400 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.883417 4819 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:37:12Z","lastTransitionTime":"2026-02-28T03:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:37:12 crc kubenswrapper[4819]: E0228 03:37:12.903749 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.908547 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.908597 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.908613 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.908637 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.908680 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:37:12Z","lastTransitionTime":"2026-02-28T03:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:37:12 crc kubenswrapper[4819]: E0228 03:37:12.929082 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.933996 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.934042 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.934060 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.934084 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.934103 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:37:12Z","lastTransitionTime":"2026-02-28T03:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:37:12 crc kubenswrapper[4819]: E0228 03:37:12.959411 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.964586 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.964646 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.964690 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.964714 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.964734 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:37:12Z","lastTransitionTime":"2026-02-28T03:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:37:12 crc kubenswrapper[4819]: E0228 03:37:12.985540 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:12Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.990142 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.990405 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.990604 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.990789 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:37:12 crc kubenswrapper[4819]: I0228 03:37:12.990938 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:37:12Z","lastTransitionTime":"2026-02-28T03:37:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 28 03:37:13 crc kubenswrapper[4819]: E0228 03:37:13.009438 4819 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-28T03:37:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81735f3f-b725-4ecc-bf66-34a29471cd39\\\",\\\"systemUUID\\\":\\\"93f3ff1a-d0a3-46b4-b86c-112127fcdcca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-28T03:37:13Z is after 2025-08-24T17:21:41Z" Feb 28 03:37:13 crc kubenswrapper[4819]: E0228 03:37:13.009663 4819 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 28 03:37:13 crc kubenswrapper[4819]: I0228 03:37:13.368291 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:13 crc kubenswrapper[4819]: E0228 03:37:13.368497 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:13 crc kubenswrapper[4819]: I0228 03:37:13.368314 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:13 crc kubenswrapper[4819]: E0228 03:37:13.369041 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:14 crc kubenswrapper[4819]: I0228 03:37:14.368138 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:14 crc kubenswrapper[4819]: I0228 03:37:14.368169 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:14 crc kubenswrapper[4819]: E0228 03:37:14.369876 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:14 crc kubenswrapper[4819]: E0228 03:37:14.370037 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:15 crc kubenswrapper[4819]: I0228 03:37:15.368526 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:15 crc kubenswrapper[4819]: I0228 03:37:15.368531 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:15 crc kubenswrapper[4819]: E0228 03:37:15.368700 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:15 crc kubenswrapper[4819]: E0228 03:37:15.368809 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:15 crc kubenswrapper[4819]: I0228 03:37:15.509177 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:15 crc kubenswrapper[4819]: E0228 03:37:15.509435 4819 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:37:15 crc kubenswrapper[4819]: E0228 03:37:15.509604 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs podName:e7eede0c-6dc0-48ac-8065-7e0d9ed91212 nodeName:}" failed. No retries permitted until 2026-02-28 03:38:19.50957474 +0000 UTC m=+237.975143628 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs") pod "network-metrics-daemon-lbrtr" (UID: "e7eede0c-6dc0-48ac-8065-7e0d9ed91212") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 28 03:37:16 crc kubenswrapper[4819]: I0228 03:37:16.368609 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:16 crc kubenswrapper[4819]: E0228 03:37:16.368807 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:16 crc kubenswrapper[4819]: I0228 03:37:16.368844 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:16 crc kubenswrapper[4819]: E0228 03:37:16.369527 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:16 crc kubenswrapper[4819]: I0228 03:37:16.370098 4819 scope.go:117] "RemoveContainer" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65" Feb 28 03:37:16 crc kubenswrapper[4819]: E0228 03:37:16.370520 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" Feb 28 03:37:17 crc kubenswrapper[4819]: I0228 03:37:17.368757 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:17 crc kubenswrapper[4819]: I0228 03:37:17.368798 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:17 crc kubenswrapper[4819]: E0228 03:37:17.368949 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:17 crc kubenswrapper[4819]: E0228 03:37:17.369091 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:17 crc kubenswrapper[4819]: E0228 03:37:17.612925 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:37:18 crc kubenswrapper[4819]: I0228 03:37:18.368139 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:18 crc kubenswrapper[4819]: I0228 03:37:18.368229 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:18 crc kubenswrapper[4819]: E0228 03:37:18.368352 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:18 crc kubenswrapper[4819]: E0228 03:37:18.368562 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:19 crc kubenswrapper[4819]: I0228 03:37:19.368738 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:19 crc kubenswrapper[4819]: I0228 03:37:19.368833 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:19 crc kubenswrapper[4819]: E0228 03:37:19.368870 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:19 crc kubenswrapper[4819]: E0228 03:37:19.369425 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:20 crc kubenswrapper[4819]: I0228 03:37:20.368186 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:20 crc kubenswrapper[4819]: E0228 03:37:20.368410 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:20 crc kubenswrapper[4819]: I0228 03:37:20.368519 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:20 crc kubenswrapper[4819]: E0228 03:37:20.368722 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:21 crc kubenswrapper[4819]: I0228 03:37:21.368757 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:21 crc kubenswrapper[4819]: I0228 03:37:21.368809 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:21 crc kubenswrapper[4819]: E0228 03:37:21.368971 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:21 crc kubenswrapper[4819]: E0228 03:37:21.369652 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.368015 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.368327 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:22 crc kubenswrapper[4819]: E0228 03:37:22.368431 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:22 crc kubenswrapper[4819]: E0228 03:37:22.368491 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.411598 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=80.411528422 podStartE2EDuration="1m20.411528422s" podCreationTimestamp="2026-02-28 03:36:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:22.410025434 +0000 UTC m=+180.875594352" watchObservedRunningTime="2026-02-28 03:37:22.411528422 +0000 UTC m=+180.877097330" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.451519 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podStartSLOduration=122.451485399 podStartE2EDuration="2m2.451485399s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:22.451305684 +0000 UTC m=+180.916874582" watchObservedRunningTime="2026-02-28 03:37:22.451485399 +0000 UTC m=+180.917054297" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.508063 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q5btw" podStartSLOduration=122.50804174 podStartE2EDuration="2m2.50804174s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:22.468926814 +0000 UTC m=+180.934495702" watchObservedRunningTime="2026-02-28 03:37:22.50804174 +0000 UTC m=+180.973610638" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.560814 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.560777226 podStartE2EDuration="48.560777226s" podCreationTimestamp="2026-02-28 03:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:22.560483078 +0000 UTC m=+181.026051976" watchObservedRunningTime="2026-02-28 03:37:22.560777226 +0000 UTC m=+181.026346144" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.561334 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=30.56132529 podStartE2EDuration="30.56132529s" podCreationTimestamp="2026-02-28 03:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:22.539607798 +0000 UTC m=+181.005176686" watchObservedRunningTime="2026-02-28 03:37:22.56132529 +0000 UTC m=+181.026894178" Feb 28 03:37:22 crc 
kubenswrapper[4819]: I0228 03:37:22.577847 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-krp5h" podStartSLOduration=122.577824511 podStartE2EDuration="2m2.577824511s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:22.577622686 +0000 UTC m=+181.043191584" watchObservedRunningTime="2026-02-28 03:37:22.577824511 +0000 UTC m=+181.043393399" Feb 28 03:37:22 crc kubenswrapper[4819]: E0228 03:37:22.614599 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.651353 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b8c5l" podStartSLOduration=122.651332915 podStartE2EDuration="2m2.651332915s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:22.64993413 +0000 UTC m=+181.115503028" watchObservedRunningTime="2026-02-28 03:37:22.651332915 +0000 UTC m=+181.116901803" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.708121 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8nk8p" podStartSLOduration=121.708097342 podStartE2EDuration="2m1.708097342s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:22.672470863 +0000 UTC m=+181.138039761" watchObservedRunningTime="2026-02-28 
03:37:22.708097342 +0000 UTC m=+181.173666240" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.721051 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=26.721027984 podStartE2EDuration="26.721027984s" podCreationTimestamp="2026-02-28 03:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:22.70922421 +0000 UTC m=+181.174793098" watchObservedRunningTime="2026-02-28 03:37:22.721027984 +0000 UTC m=+181.186596882" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.744514 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=37.74448264 podStartE2EDuration="37.74448264s" podCreationTimestamp="2026-02-28 03:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:22.722990713 +0000 UTC m=+181.188559621" watchObservedRunningTime="2026-02-28 03:37:22.74448264 +0000 UTC m=+181.210051538" Feb 28 03:37:22 crc kubenswrapper[4819]: I0228 03:37:22.784217 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5ldpg" podStartSLOduration=122.784198031 podStartE2EDuration="2m2.784198031s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:22.782883578 +0000 UTC m=+181.248452476" watchObservedRunningTime="2026-02-28 03:37:22.784198031 +0000 UTC m=+181.249766899" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.320407 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.320474 4819 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.320492 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.320517 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.320535 4819 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-28T03:37:23Z","lastTransitionTime":"2026-02-28T03:37:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.368860 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.368862 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:23 crc kubenswrapper[4819]: E0228 03:37:23.369089 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:23 crc kubenswrapper[4819]: E0228 03:37:23.369404 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.383208 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4"] Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.383918 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.387339 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.387681 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.387924 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.388070 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.499793 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/6bad408c-8f94-4b04-91d9-7339061adaa3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.499852 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6bad408c-8f94-4b04-91d9-7339061adaa3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.499897 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bad408c-8f94-4b04-91d9-7339061adaa3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.499931 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bad408c-8f94-4b04-91d9-7339061adaa3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.499970 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6bad408c-8f94-4b04-91d9-7339061adaa3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.600641 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6bad408c-8f94-4b04-91d9-7339061adaa3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.600706 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6bad408c-8f94-4b04-91d9-7339061adaa3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.600742 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bad408c-8f94-4b04-91d9-7339061adaa3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.600779 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bad408c-8f94-4b04-91d9-7339061adaa3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.600846 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6bad408c-8f94-4b04-91d9-7339061adaa3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.600872 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6bad408c-8f94-4b04-91d9-7339061adaa3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.600880 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6bad408c-8f94-4b04-91d9-7339061adaa3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.602484 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6bad408c-8f94-4b04-91d9-7339061adaa3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.607043 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bad408c-8f94-4b04-91d9-7339061adaa3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 
03:37:23.616347 4819 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.626531 4819 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.630231 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bad408c-8f94-4b04-91d9-7339061adaa3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cbmx4\" (UID: \"6bad408c-8f94-4b04-91d9-7339061adaa3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: I0228 03:37:23.705528 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" Feb 28 03:37:23 crc kubenswrapper[4819]: W0228 03:37:23.729773 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bad408c_8f94_4b04_91d9_7339061adaa3.slice/crio-55438bc49bcdae0f102890709f5542dad2a719ca393c9e5f5b3cc3ac0b8cb5c3 WatchSource:0}: Error finding container 55438bc49bcdae0f102890709f5542dad2a719ca393c9e5f5b3cc3ac0b8cb5c3: Status 404 returned error can't find the container with id 55438bc49bcdae0f102890709f5542dad2a719ca393c9e5f5b3cc3ac0b8cb5c3 Feb 28 03:37:24 crc kubenswrapper[4819]: I0228 03:37:24.331482 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" event={"ID":"6bad408c-8f94-4b04-91d9-7339061adaa3","Type":"ContainerStarted","Data":"7cc2695477bddf9ebeae6863a2b78e417c18eb6ccf1abe1b17eb59a86ff5b436"} Feb 28 03:37:24 crc kubenswrapper[4819]: I0228 03:37:24.331571 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" event={"ID":"6bad408c-8f94-4b04-91d9-7339061adaa3","Type":"ContainerStarted","Data":"55438bc49bcdae0f102890709f5542dad2a719ca393c9e5f5b3cc3ac0b8cb5c3"} Feb 28 03:37:24 crc kubenswrapper[4819]: I0228 03:37:24.352852 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbmx4" podStartSLOduration=124.352828782 podStartE2EDuration="2m4.352828782s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:24.35153863 +0000 UTC m=+182.817107528" watchObservedRunningTime="2026-02-28 03:37:24.352828782 +0000 UTC m=+182.818397680" Feb 28 03:37:24 crc kubenswrapper[4819]: I0228 03:37:24.367917 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:24 crc kubenswrapper[4819]: I0228 03:37:24.367929 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:24 crc kubenswrapper[4819]: E0228 03:37:24.368151 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:24 crc kubenswrapper[4819]: E0228 03:37:24.368291 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:25 crc kubenswrapper[4819]: I0228 03:37:25.368828 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:25 crc kubenswrapper[4819]: I0228 03:37:25.368844 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:25 crc kubenswrapper[4819]: E0228 03:37:25.369018 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:25 crc kubenswrapper[4819]: E0228 03:37:25.369191 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:26 crc kubenswrapper[4819]: I0228 03:37:26.368923 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:26 crc kubenswrapper[4819]: I0228 03:37:26.369034 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:26 crc kubenswrapper[4819]: E0228 03:37:26.369108 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:26 crc kubenswrapper[4819]: E0228 03:37:26.369278 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:27 crc kubenswrapper[4819]: I0228 03:37:27.368870 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:27 crc kubenswrapper[4819]: I0228 03:37:27.368891 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:27 crc kubenswrapper[4819]: E0228 03:37:27.369076 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:27 crc kubenswrapper[4819]: E0228 03:37:27.369200 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:27 crc kubenswrapper[4819]: E0228 03:37:27.615689 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:37:28 crc kubenswrapper[4819]: I0228 03:37:28.368420 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:28 crc kubenswrapper[4819]: I0228 03:37:28.368500 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:28 crc kubenswrapper[4819]: E0228 03:37:28.368609 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:28 crc kubenswrapper[4819]: E0228 03:37:28.368725 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:29 crc kubenswrapper[4819]: I0228 03:37:29.368948 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:29 crc kubenswrapper[4819]: I0228 03:37:29.368963 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:29 crc kubenswrapper[4819]: E0228 03:37:29.369215 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:29 crc kubenswrapper[4819]: E0228 03:37:29.369349 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:30 crc kubenswrapper[4819]: I0228 03:37:30.368579 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:30 crc kubenswrapper[4819]: I0228 03:37:30.368719 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:30 crc kubenswrapper[4819]: E0228 03:37:30.368896 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:30 crc kubenswrapper[4819]: E0228 03:37:30.370085 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:30 crc kubenswrapper[4819]: I0228 03:37:30.370312 4819 scope.go:117] "RemoveContainer" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65" Feb 28 03:37:30 crc kubenswrapper[4819]: E0228 03:37:30.370581 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-njv8f_openshift-ovn-kubernetes(caffcb28-383d-4424-a641-7dd1f36080c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" Feb 28 03:37:31 crc kubenswrapper[4819]: I0228 03:37:31.368888 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:31 crc kubenswrapper[4819]: I0228 03:37:31.368909 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:31 crc kubenswrapper[4819]: E0228 03:37:31.369648 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:31 crc kubenswrapper[4819]: E0228 03:37:31.369463 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:32 crc kubenswrapper[4819]: I0228 03:37:32.368566 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:32 crc kubenswrapper[4819]: I0228 03:37:32.368571 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:32 crc kubenswrapper[4819]: E0228 03:37:32.375172 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:32 crc kubenswrapper[4819]: E0228 03:37:32.375342 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:32 crc kubenswrapper[4819]: E0228 03:37:32.617156 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:37:33 crc kubenswrapper[4819]: I0228 03:37:33.368037 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:33 crc kubenswrapper[4819]: I0228 03:37:33.368037 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:33 crc kubenswrapper[4819]: E0228 03:37:33.368188 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:33 crc kubenswrapper[4819]: E0228 03:37:33.368373 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:34 crc kubenswrapper[4819]: I0228 03:37:34.368063 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:34 crc kubenswrapper[4819]: E0228 03:37:34.368207 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:34 crc kubenswrapper[4819]: I0228 03:37:34.368335 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:34 crc kubenswrapper[4819]: E0228 03:37:34.368638 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:35 crc kubenswrapper[4819]: I0228 03:37:35.367661 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5ldpg_78f6484e-91d1-4345-baad-9f39f49a3915/kube-multus/1.log" Feb 28 03:37:35 crc kubenswrapper[4819]: I0228 03:37:35.367903 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:35 crc kubenswrapper[4819]: E0228 03:37:35.368025 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:35 crc kubenswrapper[4819]: I0228 03:37:35.368031 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:35 crc kubenswrapper[4819]: E0228 03:37:35.368127 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:35 crc kubenswrapper[4819]: I0228 03:37:35.368712 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5ldpg_78f6484e-91d1-4345-baad-9f39f49a3915/kube-multus/0.log" Feb 28 03:37:35 crc kubenswrapper[4819]: I0228 03:37:35.368781 4819 generic.go:334] "Generic (PLEG): container finished" podID="78f6484e-91d1-4345-baad-9f39f49a3915" containerID="5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1" exitCode=1 Feb 28 03:37:35 crc kubenswrapper[4819]: I0228 03:37:35.368817 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5ldpg" event={"ID":"78f6484e-91d1-4345-baad-9f39f49a3915","Type":"ContainerDied","Data":"5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1"} Feb 28 03:37:35 crc kubenswrapper[4819]: I0228 03:37:35.368875 4819 scope.go:117] "RemoveContainer" containerID="3d319bbc787293f195505480fddd0282f9c61166d94b213c06735333c0bb7d25" Feb 28 03:37:35 crc kubenswrapper[4819]: I0228 03:37:35.369492 4819 scope.go:117] "RemoveContainer" containerID="5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1" Feb 28 03:37:35 crc kubenswrapper[4819]: E0228 03:37:35.369752 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-5ldpg_openshift-multus(78f6484e-91d1-4345-baad-9f39f49a3915)\"" 
pod="openshift-multus/multus-5ldpg" podUID="78f6484e-91d1-4345-baad-9f39f49a3915" Feb 28 03:37:36 crc kubenswrapper[4819]: I0228 03:37:36.368500 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:36 crc kubenswrapper[4819]: E0228 03:37:36.368712 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:36 crc kubenswrapper[4819]: I0228 03:37:36.368806 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:36 crc kubenswrapper[4819]: E0228 03:37:36.368949 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:36 crc kubenswrapper[4819]: I0228 03:37:36.373877 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5ldpg_78f6484e-91d1-4345-baad-9f39f49a3915/kube-multus/1.log" Feb 28 03:37:37 crc kubenswrapper[4819]: I0228 03:37:37.368204 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:37 crc kubenswrapper[4819]: I0228 03:37:37.368211 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:37 crc kubenswrapper[4819]: E0228 03:37:37.368467 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:37 crc kubenswrapper[4819]: E0228 03:37:37.368608 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:37 crc kubenswrapper[4819]: E0228 03:37:37.618737 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:37:38 crc kubenswrapper[4819]: I0228 03:37:38.368899 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:38 crc kubenswrapper[4819]: I0228 03:37:38.368941 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:38 crc kubenswrapper[4819]: E0228 03:37:38.369090 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:38 crc kubenswrapper[4819]: E0228 03:37:38.369178 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:39 crc kubenswrapper[4819]: I0228 03:37:39.367870 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:39 crc kubenswrapper[4819]: I0228 03:37:39.367894 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:39 crc kubenswrapper[4819]: E0228 03:37:39.368001 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:39 crc kubenswrapper[4819]: E0228 03:37:39.368217 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:40 crc kubenswrapper[4819]: I0228 03:37:40.368462 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:40 crc kubenswrapper[4819]: I0228 03:37:40.368528 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:40 crc kubenswrapper[4819]: E0228 03:37:40.369464 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:40 crc kubenswrapper[4819]: E0228 03:37:40.369640 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:41 crc kubenswrapper[4819]: I0228 03:37:41.368461 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:41 crc kubenswrapper[4819]: I0228 03:37:41.368503 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:41 crc kubenswrapper[4819]: E0228 03:37:41.368660 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:41 crc kubenswrapper[4819]: E0228 03:37:41.368766 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:42 crc kubenswrapper[4819]: I0228 03:37:42.368809 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:42 crc kubenswrapper[4819]: I0228 03:37:42.368861 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:42 crc kubenswrapper[4819]: E0228 03:37:42.370635 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:42 crc kubenswrapper[4819]: E0228 03:37:42.370718 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:42 crc kubenswrapper[4819]: E0228 03:37:42.619613 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:37:43 crc kubenswrapper[4819]: I0228 03:37:43.368462 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:43 crc kubenswrapper[4819]: I0228 03:37:43.368507 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:43 crc kubenswrapper[4819]: E0228 03:37:43.368999 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:43 crc kubenswrapper[4819]: E0228 03:37:43.369217 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:44 crc kubenswrapper[4819]: I0228 03:37:44.367926 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:44 crc kubenswrapper[4819]: I0228 03:37:44.367997 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:44 crc kubenswrapper[4819]: E0228 03:37:44.368096 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:44 crc kubenswrapper[4819]: E0228 03:37:44.368424 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:45 crc kubenswrapper[4819]: I0228 03:37:45.368905 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:45 crc kubenswrapper[4819]: I0228 03:37:45.368914 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:45 crc kubenswrapper[4819]: E0228 03:37:45.369095 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:45 crc kubenswrapper[4819]: E0228 03:37:45.369620 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:45 crc kubenswrapper[4819]: I0228 03:37:45.370606 4819 scope.go:117] "RemoveContainer" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65" Feb 28 03:37:46 crc kubenswrapper[4819]: I0228 03:37:46.368215 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:46 crc kubenswrapper[4819]: E0228 03:37:46.368616 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:46 crc kubenswrapper[4819]: I0228 03:37:46.368705 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:46 crc kubenswrapper[4819]: E0228 03:37:46.368921 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:46 crc kubenswrapper[4819]: I0228 03:37:46.369280 4819 scope.go:117] "RemoveContainer" containerID="5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1" Feb 28 03:37:46 crc kubenswrapper[4819]: I0228 03:37:46.414063 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/3.log" Feb 28 03:37:46 crc kubenswrapper[4819]: I0228 03:37:46.418384 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerStarted","Data":"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"} Feb 28 03:37:46 crc kubenswrapper[4819]: I0228 03:37:46.419143 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:37:46 crc kubenswrapper[4819]: I0228 03:37:46.472700 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lbrtr"] Feb 28 03:37:46 crc kubenswrapper[4819]: I0228 03:37:46.472873 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:46 crc kubenswrapper[4819]: E0228 03:37:46.473003 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:46 crc kubenswrapper[4819]: I0228 03:37:46.476750 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podStartSLOduration=145.476717494 podStartE2EDuration="2m25.476717494s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:46.473928845 +0000 UTC m=+204.939497733" watchObservedRunningTime="2026-02-28 03:37:46.476717494 +0000 UTC m=+204.942286412" Feb 28 03:37:47 crc kubenswrapper[4819]: I0228 03:37:47.368726 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:47 crc kubenswrapper[4819]: E0228 03:37:47.368975 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:47 crc kubenswrapper[4819]: I0228 03:37:47.422696 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5ldpg_78f6484e-91d1-4345-baad-9f39f49a3915/kube-multus/1.log" Feb 28 03:37:47 crc kubenswrapper[4819]: I0228 03:37:47.422793 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5ldpg" event={"ID":"78f6484e-91d1-4345-baad-9f39f49a3915","Type":"ContainerStarted","Data":"2e7f8be7b64993d771c7dd876fa6a871ff577a0eb29ba3ede7b6b602e19a1fd5"} Feb 28 03:37:47 crc kubenswrapper[4819]: E0228 03:37:47.621604 4819 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:37:48 crc kubenswrapper[4819]: I0228 03:37:48.368838 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:48 crc kubenswrapper[4819]: I0228 03:37:48.368902 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:48 crc kubenswrapper[4819]: E0228 03:37:48.369041 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:48 crc kubenswrapper[4819]: I0228 03:37:48.369097 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:48 crc kubenswrapper[4819]: E0228 03:37:48.369356 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:48 crc kubenswrapper[4819]: E0228 03:37:48.369509 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:49 crc kubenswrapper[4819]: I0228 03:37:49.368719 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:49 crc kubenswrapper[4819]: E0228 03:37:49.369598 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:50 crc kubenswrapper[4819]: I0228 03:37:50.368088 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:50 crc kubenswrapper[4819]: I0228 03:37:50.368158 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:50 crc kubenswrapper[4819]: I0228 03:37:50.368158 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:50 crc kubenswrapper[4819]: E0228 03:37:50.368266 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:50 crc kubenswrapper[4819]: E0228 03:37:50.368376 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:50 crc kubenswrapper[4819]: E0228 03:37:50.368470 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:51 crc kubenswrapper[4819]: I0228 03:37:51.367992 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:51 crc kubenswrapper[4819]: E0228 03:37:51.368661 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 28 03:37:52 crc kubenswrapper[4819]: I0228 03:37:52.368372 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:52 crc kubenswrapper[4819]: I0228 03:37:52.368399 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:52 crc kubenswrapper[4819]: E0228 03:37:52.370967 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 28 03:37:52 crc kubenswrapper[4819]: I0228 03:37:52.371325 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:52 crc kubenswrapper[4819]: E0228 03:37:52.371357 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lbrtr" podUID="e7eede0c-6dc0-48ac-8065-7e0d9ed91212" Feb 28 03:37:52 crc kubenswrapper[4819]: E0228 03:37:52.371553 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.368181 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.370778 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.370956 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.897994 4819 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.947894 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2tc9v"] Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.948864 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.950528 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tdcjf"] Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.951289 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.953439 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b4blq"] Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.954183 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.956114 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.956492 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.956764 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.956988 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.957211 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.957625 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.957842 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.958160 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.958496 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.959078 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.959406 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.965001 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.965962 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.966047 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.966435 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.966908 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds"]
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.967398 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.968182 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.968672 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.987959 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"]
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.988513 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.988600 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.988785 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.988939 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.989044 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.989160 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.989534 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.989659 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.990866 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.991028 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8"]
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.991566 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8"
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.992452 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-trsfk"]
Feb 28 03:37:53 crc kubenswrapper[4819]: I0228 03:37:53.993013 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:53.993288 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-trsfk"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:53.993447 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7rkdb"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:53.993873 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:53.994031 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:53.996564 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:53.996801 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:53.997916 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bpmrd"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:53.998313 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bpmrd"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:53.999071 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:53.999134 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.001122 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.003095 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.003730 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.004140 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.004319 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.004463 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.004627 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.004852 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.008373 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.008892 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.009046 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.009282 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.009516 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.009798 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.009938 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.010131 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.010507 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.010645 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.010774 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.010890 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.011035 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.011133 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.011195 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.011323 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.011341 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.011551 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.012087 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.012696 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.012876 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.013036 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w5fln"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.013339 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.013807 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.014078 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.014420 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.014522 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.015378 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.015482 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.015574 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.015645 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.017607 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-77ljw"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.018166 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.018686 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.019198 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.023875 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.024471 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.025626 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.028183 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.028371 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.028911 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.029484 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.029606 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.029851 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.030060 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.030463 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.030582 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.030749 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.030910 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.028189 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.031356 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.031678 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.032068 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.032196 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.032484 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.032664 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.032822 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.033869 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.034050 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.034185 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.034617 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.036211 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.036629 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.037531 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.037814 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.039123 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.040803 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.067952 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.067971 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.069410 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.069488 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.069665 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.069769 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.069882 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.069976 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.070069 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.074383 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.075304 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.075303 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.075856 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.076287 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.076468 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.076979 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.077232 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.077300 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t5bm5"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.077633 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.078897 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.078914 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.081013 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-t9qxt"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.081460 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-t9qxt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.082920 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.084296 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5w8xg"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.084866 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.096201 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.096703 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.097420 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.099090 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.099771 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.099925 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.100562 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.102975 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.109281 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.109945 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.110200 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6j222"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.110464 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.110778 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.114399 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.114578 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.114817 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.115016 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.115150 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6j222"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.131885 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.132643 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.133450 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nb49n"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.133766 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.134087 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.134202 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.137678 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.138669 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.139998 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.141862 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.145768 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rtwrl"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.146818 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.149097 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537496-dqglv"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.150278 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.152769 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tdcjf"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.152852 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537496-dqglv"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.154059 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.154603 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.155085 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.155479 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.157025 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-q8wxx"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.157686 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bpmrd"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.157762 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-q8wxx"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.158335 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.159276 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.160276 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-trsfk"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.161180 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.162149 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2tc9v"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.163074 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.163999 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b4blq"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.164943 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7rkdb"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.165864 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.166927 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t5bm5"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.167802 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.169924 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6j222"]
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170092 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6ce0f1-1683-417d-8a4b-aa5067e21b2b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pgp72\" (UID: \"ac6ce0f1-1683-417d-8a4b-aa5067e21b2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170120 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b6a78496-9606-4297-b022-286a969e9ea6-images\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170139 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-etcd-client\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170155 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cb02855-da72-47a4-9456-3b3d9faf61a8-config\") pod \"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " pod="openshift-console-operator/console-operator-58897d9998-bpmrd"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170170 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvc56\" (UniqueName: \"kubernetes.io/projected/5e0c869d-a7dc-4145-af67-3eac7eb8f1a9-kube-api-access-dvc56\") pod \"kube-storage-version-migrator-operator-b67b599dd-v629x\" (UID: \"5e0c869d-a7dc-4145-af67-3eac7eb8f1a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170198 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6a78496-9606-4297-b022-286a969e9ea6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170218 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") "
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170359 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-client-ca\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170470 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-oauth-serving-cert\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170506 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-etcd-client\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170547 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-serving-cert\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170591 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z49df\" (UniqueName: 
\"kubernetes.io/projected/506dcbc2-d163-4d73-874c-c5eb62d75dd7-kube-api-access-z49df\") pod \"cluster-samples-operator-665b6dd947-szh85\" (UID: \"506dcbc2-d163-4d73-874c-c5eb62d75dd7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170643 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e0fbd3f-cc84-4c5d-b4ba-116268fc0625-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h49g8\" (UID: \"2e0fbd3f-cc84-4c5d-b4ba-116268fc0625\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170671 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7cg6\" (UniqueName: \"kubernetes.io/projected/257ec8d9-46b8-445b-883d-cd842a4b8b61-kube-api-access-v7cg6\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170694 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-audit-dir\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170720 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-serving-cert\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170742 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw2zn\" (UniqueName: \"kubernetes.io/projected/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-kube-api-access-nw2zn\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170772 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-etcd-serving-ca\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170836 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170867 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd9sd\" (UniqueName: \"kubernetes.io/projected/f3590dc7-98a1-45cf-a420-f045d5d38335-kube-api-access-zd9sd\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170880 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170935 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76fnx\" (UniqueName: \"kubernetes.io/projected/ac6ce0f1-1683-417d-8a4b-aa5067e21b2b-kube-api-access-76fnx\") pod \"openshift-controller-manager-operator-756b6f6bc6-pgp72\" (UID: \"ac6ce0f1-1683-417d-8a4b-aa5067e21b2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170958 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-config\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.170977 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cb02855-da72-47a4-9456-3b3d9faf61a8-trusted-ca\") pod \"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " pod="openshift-console-operator/console-operator-58897d9998-bpmrd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171006 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh4fp\" (UniqueName: \"kubernetes.io/projected/1cb02855-da72-47a4-9456-3b3d9faf61a8-kube-api-access-gh4fp\") pod \"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " pod="openshift-console-operator/console-operator-58897d9998-bpmrd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171022 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-service-ca\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171073 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b245dea2-5356-4952-9edf-11b68761e382-metrics-tls\") pod \"dns-operator-744455d44c-7rkdb\" (UID: \"b245dea2-5356-4952-9edf-11b68761e382\") " pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171100 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-audit-policies\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171118 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0c869d-a7dc-4145-af67-3eac7eb8f1a9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v629x\" (UID: \"5e0c869d-a7dc-4145-af67-3eac7eb8f1a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171144 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-policies\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171161 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171177 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j9ng\" (UniqueName: \"kubernetes.io/projected/b6a78496-9606-4297-b022-286a969e9ea6-kube-api-access-9j9ng\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171197 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171212 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl7hc\" (UniqueName: \"kubernetes.io/projected/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-kube-api-access-tl7hc\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171227 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/304c4783-b723-4164-b000-8ae81986da3a-console-serving-cert\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171264 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e72012e1-89cc-4b65-ab38-78f09ec59ea4-config\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171280 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-client-ca\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171295 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171309 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-audit-dir\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " 
pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171365 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171386 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171401 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-trusted-ca-bundle\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171422 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-encryption-config\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171438 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-w2jmg\" (UniqueName: \"kubernetes.io/projected/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-kube-api-access-w2jmg\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171463 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171479 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/304c4783-b723-4164-b000-8ae81986da3a-console-oauth-config\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171493 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khd2f\" (UniqueName: \"kubernetes.io/projected/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-kube-api-access-khd2f\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171508 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e72012e1-89cc-4b65-ab38-78f09ec59ea4-machine-approver-tls\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171540 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7nnl\" (UniqueName: \"kubernetes.io/projected/e72012e1-89cc-4b65-ab38-78f09ec59ea4-kube-api-access-g7nnl\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171557 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/506dcbc2-d163-4d73-874c-c5eb62d75dd7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-szh85\" (UID: \"506dcbc2-d163-4d73-874c-c5eb62d75dd7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171581 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-console-config\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171597 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-service-ca-bundle\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171614 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bj4zw\" (UID: \"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171633 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrzps\" (UniqueName: \"kubernetes.io/projected/2e0fbd3f-cc84-4c5d-b4ba-116268fc0625-kube-api-access-qrzps\") pod \"openshift-apiserver-operator-796bbdcf4f-h49g8\" (UID: \"2e0fbd3f-cc84-4c5d-b4ba-116268fc0625\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171648 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171702 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lt7t\" (UniqueName: \"kubernetes.io/projected/b245dea2-5356-4952-9edf-11b68761e382-kube-api-access-2lt7t\") pod \"dns-operator-744455d44c-7rkdb\" (UID: \"b245dea2-5356-4952-9edf-11b68761e382\") " pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171719 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-encryption-config\") pod 
\"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171744 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171765 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171781 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-config\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171810 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sp99\" (UniqueName: \"kubernetes.io/projected/5e568a4f-33f8-447b-840c-dc560774878d-kube-api-access-5sp99\") pod \"control-plane-machine-set-operator-78cbb6b69f-qv8xl\" (UID: \"5e568a4f-33f8-447b-840c-dc560774878d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171829 
4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8-serving-cert\") pod \"openshift-config-operator-7777fb866f-bj4zw\" (UID: \"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171853 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e0c869d-a7dc-4145-af67-3eac7eb8f1a9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v629x\" (UID: \"5e0c869d-a7dc-4145-af67-3eac7eb8f1a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171874 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-audit\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171893 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-image-import-ca\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171915 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-dir\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: 
\"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171932 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171960 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171977 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0fbd3f-cc84-4c5d-b4ba-116268fc0625-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h49g8\" (UID: \"2e0fbd3f-cc84-4c5d-b4ba-116268fc0625\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.171992 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt2bd\" (UniqueName: \"kubernetes.io/projected/56a73ad1-cc3f-445b-8e0a-d8ff6937ba57-kube-api-access-qt2bd\") pod \"downloads-7954f5f757-trsfk\" (UID: \"56a73ad1-cc3f-445b-8e0a-d8ff6937ba57\") " pod="openshift-console/downloads-7954f5f757-trsfk" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172051 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172068 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6cdd\" (UniqueName: \"kubernetes.io/projected/304c4783-b723-4164-b000-8ae81986da3a-kube-api-access-f6cdd\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172088 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a78496-9606-4297-b022-286a969e9ea6-config\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172104 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-serving-cert\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172118 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cb02855-da72-47a4-9456-3b3d9faf61a8-serving-cert\") pod \"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " 
pod="openshift-console-operator/console-operator-58897d9998-bpmrd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172138 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e72012e1-89cc-4b65-ab38-78f09ec59ea4-auth-proxy-config\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172153 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-config\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172190 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172208 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-config\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172222 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257ec8d9-46b8-445b-883d-cd842a4b8b61-serving-cert\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172261 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-serving-cert\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172281 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-node-pullsecrets\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172296 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqftk\" (UniqueName: \"kubernetes.io/projected/ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8-kube-api-access-cqftk\") pod \"openshift-config-operator-7777fb866f-bj4zw\" (UID: \"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172314 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac6ce0f1-1683-417d-8a4b-aa5067e21b2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pgp72\" (UID: 
\"ac6ce0f1-1683-417d-8a4b-aa5067e21b2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172339 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e568a4f-33f8-447b-840c-dc560774878d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qv8xl\" (UID: \"5e568a4f-33f8-447b-840c-dc560774878d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172355 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172464 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w5fln"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.172769 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.173786 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.174764 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.175813 4819 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.176853 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.177937 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.178949 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.180001 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-77ljw"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.181166 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.182135 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.183157 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.184204 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-c2zzz"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.184672 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c2zzz" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.185281 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b72rs"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.187797 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.187816 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.187880 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.188445 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.189456 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rtwrl"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.190542 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.190756 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.192382 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5w8xg"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.193366 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c2zzz"] Feb 28 03:37:54 crc kubenswrapper[4819]: 
I0228 03:37:54.194585 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nb49n"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.195732 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537496-dqglv"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.199660 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-q8wxx"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.202205 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b72rs"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.203435 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.205230 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-2s2sf"] Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.206758 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2s2sf" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.210547 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.231197 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.251396 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.270664 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.272743 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a78496-9606-4297-b022-286a969e9ea6-config\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.272781 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-serving-cert\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.272807 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cb02855-da72-47a4-9456-3b3d9faf61a8-serving-cert\") pod 
\"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " pod="openshift-console-operator/console-operator-58897d9998-bpmrd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.272831 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e72012e1-89cc-4b65-ab38-78f09ec59ea4-auth-proxy-config\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.272854 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-config\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.272880 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dqwd\" (UniqueName: \"kubernetes.io/projected/7a723a02-07f6-42e8-8317-b05eef10e3d8-kube-api-access-7dqwd\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.272905 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257ec8d9-46b8-445b-883d-cd842a4b8b61-serving-cert\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.272929 4819 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.272951 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-config\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.272977 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-serving-cert\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.272999 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-node-pullsecrets\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273024 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqftk\" (UniqueName: \"kubernetes.io/projected/ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8-kube-api-access-cqftk\") pod \"openshift-config-operator-7777fb866f-bj4zw\" (UID: \"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw" Feb 
28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273048 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac6ce0f1-1683-417d-8a4b-aa5067e21b2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pgp72\" (UID: \"ac6ce0f1-1683-417d-8a4b-aa5067e21b2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273071 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e568a4f-33f8-447b-840c-dc560774878d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qv8xl\" (UID: \"5e568a4f-33f8-447b-840c-dc560774878d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273095 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273118 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ed7e72bf-07cf-4643-80da-6d11f847a61b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rtwrl\" (UID: \"ed7e72bf-07cf-4643-80da-6d11f847a61b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273171 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ac6ce0f1-1683-417d-8a4b-aa5067e21b2b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pgp72\" (UID: \"ac6ce0f1-1683-417d-8a4b-aa5067e21b2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273195 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b6a78496-9606-4297-b022-286a969e9ea6-images\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273216 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-etcd-client\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273242 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cb02855-da72-47a4-9456-3b3d9faf61a8-config\") pod \"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " pod="openshift-console-operator/console-operator-58897d9998-bpmrd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273284 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvc56\" (UniqueName: \"kubernetes.io/projected/5e0c869d-a7dc-4145-af67-3eac7eb8f1a9-kube-api-access-dvc56\") pod \"kube-storage-version-migrator-operator-b67b599dd-v629x\" (UID: \"5e0c869d-a7dc-4145-af67-3eac7eb8f1a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x" Feb 28 03:37:54 crc 
kubenswrapper[4819]: I0228 03:37:54.273316 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6a78496-9606-4297-b022-286a969e9ea6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273337 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273366 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-client-ca\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273388 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-oauth-serving-cert\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273409 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-etcd-client\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " 
pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273430 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-serving-cert\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273456 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z49df\" (UniqueName: \"kubernetes.io/projected/506dcbc2-d163-4d73-874c-c5eb62d75dd7-kube-api-access-z49df\") pod \"cluster-samples-operator-665b6dd947-szh85\" (UID: \"506dcbc2-d163-4d73-874c-c5eb62d75dd7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273474 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a78496-9606-4297-b022-286a969e9ea6-config\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273494 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7cg6\" (UniqueName: \"kubernetes.io/projected/257ec8d9-46b8-445b-883d-cd842a4b8b61-kube-api-access-v7cg6\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273518 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-audit-dir\") pod 
\"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273537 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-node-pullsecrets\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273543 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e0fbd3f-cc84-4c5d-b4ba-116268fc0625-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h49g8\" (UID: \"2e0fbd3f-cc84-4c5d-b4ba-116268fc0625\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273679 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-serving-cert\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273737 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw2zn\" (UniqueName: \"kubernetes.io/projected/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-kube-api-access-nw2zn\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273760 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-etcd-serving-ca\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273784 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a723a02-07f6-42e8-8317-b05eef10e3d8-metrics-certs\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273808 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a723a02-07f6-42e8-8317-b05eef10e3d8-service-ca-bundle\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273835 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273861 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd9sd\" (UniqueName: \"kubernetes.io/projected/f3590dc7-98a1-45cf-a420-f045d5d38335-kube-api-access-zd9sd\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc 
kubenswrapper[4819]: I0228 03:37:54.273884 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zktqr\" (UniqueName: \"kubernetes.io/projected/ed7e72bf-07cf-4643-80da-6d11f847a61b-kube-api-access-zktqr\") pod \"multus-admission-controller-857f4d67dd-rtwrl\" (UID: \"ed7e72bf-07cf-4643-80da-6d11f847a61b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273911 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-service-ca\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.273936 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76fnx\" (UniqueName: \"kubernetes.io/projected/ac6ce0f1-1683-417d-8a4b-aa5067e21b2b-kube-api-access-76fnx\") pod \"openshift-controller-manager-operator-756b6f6bc6-pgp72\" (UID: \"ac6ce0f1-1683-417d-8a4b-aa5067e21b2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274012 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-config\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274091 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cb02855-da72-47a4-9456-3b3d9faf61a8-trusted-ca\") pod \"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " pod="openshift-console-operator/console-operator-58897d9998-bpmrd"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274125 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh4fp\" (UniqueName: \"kubernetes.io/projected/1cb02855-da72-47a4-9456-3b3d9faf61a8-kube-api-access-gh4fp\") pod \"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " pod="openshift-console-operator/console-operator-58897d9998-bpmrd"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274163 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b245dea2-5356-4952-9edf-11b68761e382-metrics-tls\") pod \"dns-operator-744455d44c-7rkdb\" (UID: \"b245dea2-5356-4952-9edf-11b68761e382\") " pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274192 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-audit-policies\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274219 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0c869d-a7dc-4145-af67-3eac7eb8f1a9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v629x\" (UID: \"5e0c869d-a7dc-4145-af67-3eac7eb8f1a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274270 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-policies\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274300 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274342 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j9ng\" (UniqueName: \"kubernetes.io/projected/b6a78496-9606-4297-b022-286a969e9ea6-kube-api-access-9j9ng\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274369 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274398 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl7hc\" (UniqueName: \"kubernetes.io/projected/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-kube-api-access-tl7hc\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274425 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/304c4783-b723-4164-b000-8ae81986da3a-console-serving-cert\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274449 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e72012e1-89cc-4b65-ab38-78f09ec59ea4-config\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274474 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-client-ca\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274502 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274525 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-audit-dir\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274547 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-trusted-ca-bundle\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274579 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7a723a02-07f6-42e8-8317-b05eef10e3d8-stats-auth\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274626 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274652 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274681 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-encryption-config\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274703 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2jmg\" (UniqueName: \"kubernetes.io/projected/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-kube-api-access-w2jmg\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274731 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274754 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/304c4783-b723-4164-b000-8ae81986da3a-console-oauth-config\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274777 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khd2f\" (UniqueName: \"kubernetes.io/projected/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-kube-api-access-khd2f\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274802 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e72012e1-89cc-4b65-ab38-78f09ec59ea4-machine-approver-tls\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274834 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7nnl\" (UniqueName: \"kubernetes.io/projected/e72012e1-89cc-4b65-ab38-78f09ec59ea4-kube-api-access-g7nnl\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274865 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274869 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/506dcbc2-d163-4d73-874c-c5eb62d75dd7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-szh85\" (UID: \"506dcbc2-d163-4d73-874c-c5eb62d75dd7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274918 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-console-config\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274939 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-service-ca-bundle\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274959 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bj4zw\" (UID: \"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.274976 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275001 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrzps\" (UniqueName: \"kubernetes.io/projected/2e0fbd3f-cc84-4c5d-b4ba-116268fc0625-kube-api-access-qrzps\") pod \"openshift-apiserver-operator-796bbdcf4f-h49g8\" (UID: \"2e0fbd3f-cc84-4c5d-b4ba-116268fc0625\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275021 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lt7t\" (UniqueName: \"kubernetes.io/projected/b245dea2-5356-4952-9edf-11b68761e382-kube-api-access-2lt7t\") pod \"dns-operator-744455d44c-7rkdb\" (UID: \"b245dea2-5356-4952-9edf-11b68761e382\") " pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275036 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-encryption-config\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275053 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275070 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275088 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-config\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275106 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8-serving-cert\") pod \"openshift-config-operator-7777fb866f-bj4zw\" (UID: \"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275128 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sp99\" (UniqueName: \"kubernetes.io/projected/5e568a4f-33f8-447b-840c-dc560774878d-kube-api-access-5sp99\") pod \"control-plane-machine-set-operator-78cbb6b69f-qv8xl\" (UID: \"5e568a4f-33f8-447b-840c-dc560774878d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275148 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e0c869d-a7dc-4145-af67-3eac7eb8f1a9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v629x\" (UID: \"5e0c869d-a7dc-4145-af67-3eac7eb8f1a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275165 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-audit\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275180 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-image-import-ca\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275197 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-dir\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275213 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275229 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt2bd\" (UniqueName: \"kubernetes.io/projected/56a73ad1-cc3f-445b-8e0a-d8ff6937ba57-kube-api-access-qt2bd\") pod \"downloads-7954f5f757-trsfk\" (UID: \"56a73ad1-cc3f-445b-8e0a-d8ff6937ba57\") " pod="openshift-console/downloads-7954f5f757-trsfk"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275270 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275289 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0fbd3f-cc84-4c5d-b4ba-116268fc0625-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h49g8\" (UID: \"2e0fbd3f-cc84-4c5d-b4ba-116268fc0625\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275305 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275306 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-config\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275333 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6cdd\" (UniqueName: \"kubernetes.io/projected/304c4783-b723-4164-b000-8ae81986da3a-kube-api-access-f6cdd\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275355 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7a723a02-07f6-42e8-8317-b05eef10e3d8-default-certificate\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.275411 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-config\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.276018 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-console-config\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.276414 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e72012e1-89cc-4b65-ab38-78f09ec59ea4-auth-proxy-config\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.276451 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-service-ca-bundle\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.276682 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bj4zw\" (UID: \"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.277191 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.277919 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-serving-cert\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.277986 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-audit-dir\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.278023 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-etcd-serving-ca\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.278079 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cb02855-da72-47a4-9456-3b3d9faf61a8-serving-cert\") pod \"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " pod="openshift-console-operator/console-operator-58897d9998-bpmrd"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.278467 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/506dcbc2-d163-4d73-874c-c5eb62d75dd7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-szh85\" (UID: \"506dcbc2-d163-4d73-874c-c5eb62d75dd7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.278775 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-client-ca\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.278809 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-etcd-client\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.278990 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-serving-cert\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.279068 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257ec8d9-46b8-445b-883d-cd842a4b8b61-serving-cert\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.279502 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.279815 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-config\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.280630 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e0fbd3f-cc84-4c5d-b4ba-116268fc0625-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h49g8\" (UID: \"2e0fbd3f-cc84-4c5d-b4ba-116268fc0625\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.280641 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.280783 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-audit-dir\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.280831 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cb02855-da72-47a4-9456-3b3d9faf61a8-trusted-ca\") pod \"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " pod="openshift-console-operator/console-operator-58897d9998-bpmrd"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.281158 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-serving-cert\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.281762 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cb02855-da72-47a4-9456-3b3d9faf61a8-config\") pod \"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " pod="openshift-console-operator/console-operator-58897d9998-bpmrd"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.281826 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-trusted-ca-bundle\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.281863 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-serving-cert\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.282001 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-service-ca\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.282115 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.282283 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-dir\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.282453 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-image-import-ca\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.282590 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-audit-policies\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.283934 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.284019 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e0c869d-a7dc-4145-af67-3eac7eb8f1a9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v629x\" (UID: \"5e0c869d-a7dc-4145-af67-3eac7eb8f1a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.284293 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6ce0f1-1683-417d-8a4b-aa5067e21b2b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pgp72\" (UID: \"ac6ce0f1-1683-417d-8a4b-aa5067e21b2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.284378 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8-serving-cert\") pod \"openshift-config-operator-7777fb866f-bj4zw\" (UID: \"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.284510 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.284608 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0fbd3f-cc84-4c5d-b4ba-116268fc0625-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h49g8\" (UID: \"2e0fbd3f-cc84-4c5d-b4ba-116268fc0625\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.284698 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e0c869d-a7dc-4145-af67-3eac7eb8f1a9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v629x\" (UID: \"5e0c869d-a7dc-4145-af67-3eac7eb8f1a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.284701 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac6ce0f1-1683-417d-8a4b-aa5067e21b2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pgp72\" (UID: \"ac6ce0f1-1683-417d-8a4b-aa5067e21b2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.284930 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-audit\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:54 crc kubenswrapper[4819]: I0228
03:37:54.284936 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-policies\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.285179 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.285224 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-client-ca\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.285539 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b6a78496-9606-4297-b022-286a969e9ea6-images\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.285670 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6a78496-9606-4297-b022-286a969e9ea6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.285745 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-encryption-config\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.285801 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/304c4783-b723-4164-b000-8ae81986da3a-oauth-serving-cert\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.285822 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e72012e1-89cc-4b65-ab38-78f09ec59ea4-config\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.286031 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b245dea2-5356-4952-9edf-11b68761e382-metrics-tls\") pod \"dns-operator-744455d44c-7rkdb\" (UID: \"b245dea2-5356-4952-9edf-11b68761e382\") " pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.286144 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: 
\"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.286935 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.286971 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.286973 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-config\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.287285 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e72012e1-89cc-4b65-ab38-78f09ec59ea4-machine-approver-tls\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.288215 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-encryption-config\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.288272 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.288359 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-etcd-client\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.288394 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e568a4f-33f8-447b-840c-dc560774878d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qv8xl\" (UID: \"5e568a4f-33f8-447b-840c-dc560774878d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.288741 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/304c4783-b723-4164-b000-8ae81986da3a-console-oauth-config\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.288891 4819 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.289526 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.290025 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.291466 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.292019 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/304c4783-b723-4164-b000-8ae81986da3a-console-serving-cert\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.294004 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.310795 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.331036 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.350680 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.368214 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.368450 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.368469 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.371157 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.376025 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7a723a02-07f6-42e8-8317-b05eef10e3d8-stats-auth\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.376288 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7a723a02-07f6-42e8-8317-b05eef10e3d8-default-certificate\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.376419 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dqwd\" (UniqueName: \"kubernetes.io/projected/7a723a02-07f6-42e8-8317-b05eef10e3d8-kube-api-access-7dqwd\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.376588 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ed7e72bf-07cf-4643-80da-6d11f847a61b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rtwrl\" (UID: \"ed7e72bf-07cf-4643-80da-6d11f847a61b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 
03:37:54.376746 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a723a02-07f6-42e8-8317-b05eef10e3d8-metrics-certs\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.376876 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zktqr\" (UniqueName: \"kubernetes.io/projected/ed7e72bf-07cf-4643-80da-6d11f847a61b-kube-api-access-zktqr\") pod \"multus-admission-controller-857f4d67dd-rtwrl\" (UID: \"ed7e72bf-07cf-4643-80da-6d11f847a61b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.376973 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a723a02-07f6-42e8-8317-b05eef10e3d8-service-ca-bundle\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.391465 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.411696 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.431998 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.450503 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.471147 4819 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.491172 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.511314 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.530849 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.551425 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.571231 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.581061 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7a723a02-07f6-42e8-8317-b05eef10e3d8-default-certificate\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.591821 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.602952 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7a723a02-07f6-42e8-8317-b05eef10e3d8-metrics-certs\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 
03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.611856 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.631658 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.638102 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a723a02-07f6-42e8-8317-b05eef10e3d8-service-ca-bundle\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.651740 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.660397 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7a723a02-07f6-42e8-8317-b05eef10e3d8-stats-auth\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.673056 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.693079 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.711588 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.732164 4819 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.751778 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.781300 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.791198 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.811311 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.832126 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.850843 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.871629 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.892380 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.911525 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.942610 4819 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.951582 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.972536 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 28 03:37:54 crc kubenswrapper[4819]: I0228 03:37:54.992431 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.011648 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.032477 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.051469 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.071371 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.091441 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.111387 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.130086 4819 request.go:700] Waited for 1.015403669s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/secrets?fieldSelector=metadata.name%3Dcluster-image-registry-operator-dockercfg-m4qtx&limit=500&resourceVersion=0 Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.132656 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.152744 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.171470 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.216629 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.217030 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.231142 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.251968 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.270774 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.290973 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 28 03:37:55 crc kubenswrapper[4819]: 
I0228 03:37:55.311179 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.331874 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.352349 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.371312 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 28 03:37:55 crc kubenswrapper[4819]: E0228 03:37:55.377076 4819 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Feb 28 03:37:55 crc kubenswrapper[4819]: E0228 03:37:55.377153 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7e72bf-07cf-4643-80da-6d11f847a61b-webhook-certs podName:ed7e72bf-07cf-4643-80da-6d11f847a61b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:55.877135101 +0000 UTC m=+214.342703959 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ed7e72bf-07cf-4643-80da-6d11f847a61b-webhook-certs") pod "multus-admission-controller-857f4d67dd-rtwrl" (UID: "ed7e72bf-07cf-4643-80da-6d11f847a61b") : failed to sync secret cache: timed out waiting for the condition
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.392046 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.411128 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.430814 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.452399 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.471948 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.504228 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.512053 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.532505 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.560997 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.571470 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.591853 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.611978 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.632082 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.651879 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.671556 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.692382 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.711652 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.732649 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.752410 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.771719 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.791906 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.811665 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.872364 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.891392 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.907457 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ed7e72bf-07cf-4643-80da-6d11f847a61b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rtwrl\" (UID: \"ed7e72bf-07cf-4643-80da-6d11f847a61b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.911863 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.912338 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ed7e72bf-07cf-4643-80da-6d11f847a61b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rtwrl\" (UID: \"ed7e72bf-07cf-4643-80da-6d11f847a61b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.932781 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.951396 4819 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.971793 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 28 03:37:55 crc kubenswrapper[4819]: I0228 03:37:55.992457 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.012355 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.033098 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.052405 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.105200 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqftk\" (UniqueName: \"kubernetes.io/projected/ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8-kube-api-access-cqftk\") pod \"openshift-config-operator-7777fb866f-bj4zw\" (UID: \"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.118565 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76fnx\" (UniqueName: \"kubernetes.io/projected/ac6ce0f1-1683-417d-8a4b-aa5067e21b2b-kube-api-access-76fnx\") pod \"openshift-controller-manager-operator-756b6f6bc6-pgp72\" (UID: \"ac6ce0f1-1683-417d-8a4b-aa5067e21b2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.140407 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrzps\" (UniqueName: \"kubernetes.io/projected/2e0fbd3f-cc84-4c5d-b4ba-116268fc0625-kube-api-access-qrzps\") pod \"openshift-apiserver-operator-796bbdcf4f-h49g8\" (UID: \"2e0fbd3f-cc84-4c5d-b4ba-116268fc0625\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.149762 4819 request.go:700] Waited for 1.871650677s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.161699 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lt7t\" (UniqueName: \"kubernetes.io/projected/b245dea2-5356-4952-9edf-11b68761e382-kube-api-access-2lt7t\") pod \"dns-operator-744455d44c-7rkdb\" (UID: \"b245dea2-5356-4952-9edf-11b68761e382\") " pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.167610 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z49df\" (UniqueName: \"kubernetes.io/projected/506dcbc2-d163-4d73-874c-c5eb62d75dd7-kube-api-access-z49df\") pod \"cluster-samples-operator-665b6dd947-szh85\" (UID: \"506dcbc2-d163-4d73-874c-c5eb62d75dd7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.199603 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7cg6\" (UniqueName: \"kubernetes.io/projected/257ec8d9-46b8-445b-883d-cd842a4b8b61-kube-api-access-v7cg6\") pod \"route-controller-manager-6576b87f9c-9z6bd\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.212000 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd9sd\" (UniqueName: \"kubernetes.io/projected/f3590dc7-98a1-45cf-a420-f045d5d38335-kube-api-access-zd9sd\") pod \"oauth-openshift-558db77b4-w5fln\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.227971 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2jmg\" (UniqueName: \"kubernetes.io/projected/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-kube-api-access-w2jmg\") pod \"controller-manager-879f6c89f-b4blq\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.232558 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.250693 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sp99\" (UniqueName: \"kubernetes.io/projected/5e568a4f-33f8-447b-840c-dc560774878d-kube-api-access-5sp99\") pod \"control-plane-machine-set-operator-78cbb6b69f-qv8xl\" (UID: \"5e568a4f-33f8-447b-840c-dc560774878d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.258372 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.264221 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.270166 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khd2f\" (UniqueName: \"kubernetes.io/projected/ed79a80c-ec3b-4446-9ad3-4e1906715cd7-kube-api-access-khd2f\") pod \"apiserver-76f77b778f-2tc9v\" (UID: \"ed79a80c-ec3b-4446-9ad3-4e1906715cd7\") " pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.278922 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.284321 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.290317 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh4fp\" (UniqueName: \"kubernetes.io/projected/1cb02855-da72-47a4-9456-3b3d9faf61a8-kube-api-access-gh4fp\") pod \"console-operator-58897d9998-bpmrd\" (UID: \"1cb02855-da72-47a4-9456-3b3d9faf61a8\") " pod="openshift-console-operator/console-operator-58897d9998-bpmrd"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.290453 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.301135 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.305805 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j9ng\" (UniqueName: \"kubernetes.io/projected/b6a78496-9606-4297-b022-286a969e9ea6-kube-api-access-9j9ng\") pod \"machine-api-operator-5694c8668f-tdcjf\" (UID: \"b6a78496-9606-4297-b022-286a969e9ea6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.325609 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.326285 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvc56\" (UniqueName: \"kubernetes.io/projected/5e0c869d-a7dc-4145-af67-3eac7eb8f1a9-kube-api-access-dvc56\") pod \"kube-storage-version-migrator-operator-b67b599dd-v629x\" (UID: \"5e0c869d-a7dc-4145-af67-3eac7eb8f1a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.352919 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl7hc\" (UniqueName: \"kubernetes.io/projected/5b1cfdf7-8bbd-4913-9786-2a71cf6baec1-kube-api-access-tl7hc\") pod \"apiserver-7bbb656c7d-gjpcj\" (UID: \"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.375433 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt2bd\" (UniqueName: \"kubernetes.io/projected/56a73ad1-cc3f-445b-8e0a-d8ff6937ba57-kube-api-access-qt2bd\") pod \"downloads-7954f5f757-trsfk\" (UID: \"56a73ad1-cc3f-445b-8e0a-d8ff6937ba57\") " pod="openshift-console/downloads-7954f5f757-trsfk"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.390762 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7nnl\" (UniqueName: \"kubernetes.io/projected/e72012e1-89cc-4b65-ab38-78f09ec59ea4-kube-api-access-g7nnl\") pod \"machine-approver-56656f9798-qcfds\" (UID: \"e72012e1-89cc-4b65-ab38-78f09ec59ea4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.390976 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.415173 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.415379 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw2zn\" (UniqueName: \"kubernetes.io/projected/e6c1e24e-7367-4030-bbe3-0c5a2595ba4a-kube-api-access-nw2zn\") pod \"authentication-operator-69f744f599-7nkqj\" (UID: \"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.420145 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.425257 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6cdd\" (UniqueName: \"kubernetes.io/projected/304c4783-b723-4164-b000-8ae81986da3a-kube-api-access-f6cdd\") pod \"console-f9d7485db-77ljw\" (UID: \"304c4783-b723-4164-b000-8ae81986da3a\") " pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.431270 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.434740 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.451490 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.472472 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.473957 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.490708 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.490948 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds" event={"ID":"e72012e1-89cc-4b65-ab38-78f09ec59ea4","Type":"ContainerStarted","Data":"de4ebaa718df81e1902d557832116e9db9518a2b75d2ec108e1a18990d422d0d"}
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.526034 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dqwd\" (UniqueName: \"kubernetes.io/projected/7a723a02-07f6-42e8-8317-b05eef10e3d8-kube-api-access-7dqwd\") pod \"router-default-5444994796-t9qxt\" (UID: \"7a723a02-07f6-42e8-8317-b05eef10e3d8\") " pod="openshift-ingress/router-default-5444994796-t9qxt"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.544062 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-trsfk"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.554934 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zktqr\" (UniqueName: \"kubernetes.io/projected/ed7e72bf-07cf-4643-80da-6d11f847a61b-kube-api-access-zktqr\") pod \"multus-admission-controller-857f4d67dd-rtwrl\" (UID: \"ed7e72bf-07cf-4643-80da-6d11f847a61b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.570851 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bpmrd"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.605974 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.612548 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619002 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a452d812-570e-4a9a-a473-c7bfa1daffe1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-s2mhl\" (UID: \"a452d812-570e-4a9a-a473-c7bfa1daffe1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619075 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75zxt\" (UID: \"f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619107 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a12e5e4-9129-4a00-ae0c-684869c0cff7-secret-volume\") pod \"collect-profiles-29537490-qt4j4\" (UID: \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619123 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c1530a3b-acb7-4a9a-bc2b-339db29fa05f-srv-cert\") pod \"olm-operator-6b444d44fb-pg9r5\" (UID: \"c1530a3b-acb7-4a9a-bc2b-339db29fa05f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619139 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-metrics-tls\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619164 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55ccad0a-8169-4f92-89e1-6b8db13f255b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619180 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c8fc9078-b36f-49a3-b53c-67f60a904e8d-signing-key\") pod \"service-ca-9c57cc56f-6j222\" (UID: \"c8fc9078-b36f-49a3-b53c-67f60a904e8d\") " pod="openshift-service-ca/service-ca-9c57cc56f-6j222"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619194 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5hdh\" (UniqueName: \"kubernetes.io/projected/c1530a3b-acb7-4a9a-bc2b-339db29fa05f-kube-api-access-t5hdh\") pod \"olm-operator-6b444d44fb-pg9r5\" (UID: \"c1530a3b-acb7-4a9a-bc2b-339db29fa05f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619218 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwm4t\" (UniqueName: \"kubernetes.io/projected/3a12e5e4-9129-4a00-ae0c-684869c0cff7-kube-api-access-bwm4t\") pod \"collect-profiles-29537490-qt4j4\" (UID: \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619264 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-bound-sa-token\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619300 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef03f8a2-9dde-4ee7-af68-2693ddf59fb4-serving-cert\") pod \"service-ca-operator-777779d784-w5n5t\" (UID: \"ef03f8a2-9dde-4ee7-af68-2693ddf59fb4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619318 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619335 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-apiservice-cert\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: \"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619349 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a19bd4f6-2bd3-404f-9282-89a12444562f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f6bql\" (UID: \"a19bd4f6-2bd3-404f-9282-89a12444562f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619372 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlnl4\" (UniqueName: \"kubernetes.io/projected/a452d812-570e-4a9a-a473-c7bfa1daffe1-kube-api-access-hlnl4\") pod \"machine-config-controller-84d6567774-s2mhl\" (UID: \"a452d812-570e-4a9a-a473-c7bfa1daffe1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619388 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7m2k\" (UniqueName: \"kubernetes.io/projected/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-kube-api-access-s7m2k\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619403 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/47b61e92-8bbd-40e0-96ee-b3bacd20950d-etcd-client\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619416 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-webhook-cert\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: \"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619440 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a12e5e4-9129-4a00-ae0c-684869c0cff7-config-volume\") pod \"collect-profiles-29537490-qt4j4\" (UID: \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619477 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ce43b92c-92e6-45b3-abf9-20db64e0ec05-srv-cert\") pod \"catalog-operator-68c6474976-mxlcb\" (UID: \"ce43b92c-92e6-45b3-abf9-20db64e0ec05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.619492 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55ccad0a-8169-4f92-89e1-6b8db13f255b-images\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.620585 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.621281 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.621601 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-certificates\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.621643 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzsvq\" (UniqueName: \"kubernetes.io/projected/0a08aebb-db7e-488c-b992-2286ba6c9fd0-kube-api-access-mzsvq\") pod \"marketplace-operator-79b997595-nb49n\" (UID: \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-nb49n"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.621778 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47b61e92-8bbd-40e0-96ee-b3bacd20950d-serving-cert\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.621835 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622040 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef03f8a2-9dde-4ee7-af68-2693ddf59fb4-config\") pod \"service-ca-operator-777779d784-w5n5t\" (UID: \"ef03f8a2-9dde-4ee7-af68-2693ddf59fb4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622091 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b61e92-8bbd-40e0-96ee-b3bacd20950d-config\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622110 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nb49n\" (UID: \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-nb49n"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622167 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19bd4f6-2bd3-404f-9282-89a12444562f-config\") pod \"kube-controller-manager-operator-78b949d7b-f6bql\" (UID: \"a19bd4f6-2bd3-404f-9282-89a12444562f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622374 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f94n4\" (UniqueName: \"kubernetes.io/projected/c8fc9078-b36f-49a3-b53c-67f60a904e8d-kube-api-access-f94n4\") pod \"service-ca-9c57cc56f-6j222\" (UID: \"c8fc9078-b36f-49a3-b53c-67f60a904e8d\") " pod="openshift-service-ca/service-ca-9c57cc56f-6j222"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622408 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30d3c101-bb83-41b0-88b0-a09b6135d7d8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dc26n\" (UID: \"30d3c101-bb83-41b0-88b0-a09b6135d7d8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622440 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4mp8\" (UniqueName: \"kubernetes.io/projected/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-kube-api-access-r4mp8\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: \"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622479 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nb49n\" (UID: \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-nb49n"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622526 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-trusted-ca\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622581 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55ccad0a-8169-4f92-89e1-6b8db13f255b-proxy-tls\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622605 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a19bd4f6-2bd3-404f-9282-89a12444562f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f6bql\" (UID: \"a19bd4f6-2bd3-404f-9282-89a12444562f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622842 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75zxt\" (UID: \"f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622864 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f70ddc86-ef85-47d5-a9de-2c2a314ca39b-config-volume\") pod \"dns-default-q8wxx\" (UID: \"f70ddc86-ef85-47d5-a9de-2c2a314ca39b\") " pod="openshift-dns/dns-default-q8wxx"
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.622949 4819
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-tls\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.623077 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/47b61e92-8bbd-40e0-96ee-b3bacd20950d-etcd-ca\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.623116 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75zxt\" (UID: \"f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.623183 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-tmpfs\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: \"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.623219 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49vs\" (UniqueName: \"kubernetes.io/projected/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-kube-api-access-j49vs\") pod 
\"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.623240 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ce43b92c-92e6-45b3-abf9-20db64e0ec05-profile-collector-cert\") pod \"catalog-operator-68c6474976-mxlcb\" (UID: \"ce43b92c-92e6-45b3-abf9-20db64e0ec05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.623279 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.623317 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.623332 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbqcs\" (UniqueName: \"kubernetes.io/projected/47b61e92-8bbd-40e0-96ee-b3bacd20950d-kube-api-access-bbqcs\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc 
kubenswrapper[4819]: I0228 03:37:56.623355 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c8fc9078-b36f-49a3-b53c-67f60a904e8d-signing-cabundle\") pod \"service-ca-9c57cc56f-6j222\" (UID: \"c8fc9078-b36f-49a3-b53c-67f60a904e8d\") " pod="openshift-service-ca/service-ca-9c57cc56f-6j222" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.623513 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44wgm\" (UniqueName: \"kubernetes.io/projected/0bba4110-e884-4369-8e1e-676a7afed536-kube-api-access-44wgm\") pod \"package-server-manager-789f6589d5-5pnmf\" (UID: \"0bba4110-e884-4369-8e1e-676a7afed536\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.623535 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/47b61e92-8bbd-40e0-96ee-b3bacd20950d-etcd-service-ca\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.623998 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcwjt\" (UniqueName: \"kubernetes.io/projected/f70ddc86-ef85-47d5-a9de-2c2a314ca39b-kube-api-access-gcwjt\") pod \"dns-default-q8wxx\" (UID: \"f70ddc86-ef85-47d5-a9de-2c2a314ca39b\") " pod="openshift-dns/dns-default-q8wxx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.624088 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f70ddc86-ef85-47d5-a9de-2c2a314ca39b-metrics-tls\") pod \"dns-default-q8wxx\" (UID: 
\"f70ddc86-ef85-47d5-a9de-2c2a314ca39b\") " pod="openshift-dns/dns-default-q8wxx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.624225 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-trusted-ca\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.624984 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30d3c101-bb83-41b0-88b0-a09b6135d7d8-config\") pod \"kube-apiserver-operator-766d6c64bb-dc26n\" (UID: \"30d3c101-bb83-41b0-88b0-a09b6135d7d8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625028 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w82dc\" (UniqueName: \"kubernetes.io/projected/23ec7136-9dc9-47c2-bf41-7b798e6bfe60-kube-api-access-w82dc\") pod \"auto-csr-approver-29537496-dqglv\" (UID: \"23ec7136-9dc9-47c2-bf41-7b798e6bfe60\") " pod="openshift-infra/auto-csr-approver-29537496-dqglv" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625050 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6k5k\" (UniqueName: \"kubernetes.io/projected/ce43b92c-92e6-45b3-abf9-20db64e0ec05-kube-api-access-f6k5k\") pod \"catalog-operator-68c6474976-mxlcb\" (UID: \"ce43b92c-92e6-45b3-abf9-20db64e0ec05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625067 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-rmbp7\" (UniqueName: \"kubernetes.io/projected/ef03f8a2-9dde-4ee7-af68-2693ddf59fb4-kube-api-access-rmbp7\") pod \"service-ca-operator-777779d784-w5n5t\" (UID: \"ef03f8a2-9dde-4ee7-af68-2693ddf59fb4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625261 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c1530a3b-acb7-4a9a-bc2b-339db29fa05f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pg9r5\" (UID: \"c1530a3b-acb7-4a9a-bc2b-339db29fa05f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625293 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf7n9\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-kube-api-access-vf7n9\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625319 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625349 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bba4110-e884-4369-8e1e-676a7afed536-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-5pnmf\" (UID: \"0bba4110-e884-4369-8e1e-676a7afed536\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625366 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9st68\" (UniqueName: \"kubernetes.io/projected/55ccad0a-8169-4f92-89e1-6b8db13f255b-kube-api-access-9st68\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625389 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625407 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjlv7\" (UniqueName: \"kubernetes.io/projected/dea5b1f1-4b1d-4741-be96-ec4c56b3c3db-kube-api-access-tjlv7\") pod \"migrator-59844c95c7-b7z5m\" (UID: \"dea5b1f1-4b1d-4741-be96-ec4c56b3c3db\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625423 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a452d812-570e-4a9a-a473-c7bfa1daffe1-proxy-tls\") pod \"machine-config-controller-84d6567774-s2mhl\" (UID: \"a452d812-570e-4a9a-a473-c7bfa1daffe1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" Feb 28 
03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.625454 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30d3c101-bb83-41b0-88b0-a09b6135d7d8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dc26n\" (UID: \"30d3c101-bb83-41b0-88b0-a09b6135d7d8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" Feb 28 03:37:56 crc kubenswrapper[4819]: E0228 03:37:56.625735 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:37:57.125725669 +0000 UTC m=+215.591294527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.666532 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.727756 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.727901 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwqq9\" (UniqueName: \"kubernetes.io/projected/66425eb0-e8aa-44b7-b316-10949f9cc414-kube-api-access-gwqq9\") pod \"ingress-canary-c2zzz\" (UID: \"66425eb0-e8aa-44b7-b316-10949f9cc414\") " pod="openshift-ingress-canary/ingress-canary-c2zzz" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.727931 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44wgm\" (UniqueName: \"kubernetes.io/projected/0bba4110-e884-4369-8e1e-676a7afed536-kube-api-access-44wgm\") pod \"package-server-manager-789f6589d5-5pnmf\" (UID: \"0bba4110-e884-4369-8e1e-676a7afed536\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.727950 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/47b61e92-8bbd-40e0-96ee-b3bacd20950d-etcd-service-ca\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.727976 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f70ddc86-ef85-47d5-a9de-2c2a314ca39b-metrics-tls\") pod \"dns-default-q8wxx\" (UID: \"f70ddc86-ef85-47d5-a9de-2c2a314ca39b\") " pod="openshift-dns/dns-default-q8wxx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.727991 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcwjt\" (UniqueName: \"kubernetes.io/projected/f70ddc86-ef85-47d5-a9de-2c2a314ca39b-kube-api-access-gcwjt\") pod \"dns-default-q8wxx\" (UID: \"f70ddc86-ef85-47d5-a9de-2c2a314ca39b\") " pod="openshift-dns/dns-default-q8wxx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728008 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b8f7cbee-14ae-4ae9-a139-8eea6fe271a3-node-bootstrap-token\") pod \"machine-config-server-2s2sf\" (UID: \"b8f7cbee-14ae-4ae9-a139-8eea6fe271a3\") " pod="openshift-machine-config-operator/machine-config-server-2s2sf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728025 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-trusted-ca\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728049 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30d3c101-bb83-41b0-88b0-a09b6135d7d8-config\") pod \"kube-apiserver-operator-766d6c64bb-dc26n\" (UID: \"30d3c101-bb83-41b0-88b0-a09b6135d7d8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728063 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"certs\" (UniqueName: \"kubernetes.io/secret/b8f7cbee-14ae-4ae9-a139-8eea6fe271a3-certs\") pod \"machine-config-server-2s2sf\" (UID: \"b8f7cbee-14ae-4ae9-a139-8eea6fe271a3\") " pod="openshift-machine-config-operator/machine-config-server-2s2sf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728080 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c1530a3b-acb7-4a9a-bc2b-339db29fa05f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pg9r5\" (UID: \"c1530a3b-acb7-4a9a-bc2b-339db29fa05f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728095 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf7n9\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-kube-api-access-vf7n9\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728111 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w82dc\" (UniqueName: \"kubernetes.io/projected/23ec7136-9dc9-47c2-bf41-7b798e6bfe60-kube-api-access-w82dc\") pod \"auto-csr-approver-29537496-dqglv\" (UID: \"23ec7136-9dc9-47c2-bf41-7b798e6bfe60\") " pod="openshift-infra/auto-csr-approver-29537496-dqglv" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728126 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6k5k\" (UniqueName: \"kubernetes.io/projected/ce43b92c-92e6-45b3-abf9-20db64e0ec05-kube-api-access-f6k5k\") pod \"catalog-operator-68c6474976-mxlcb\" (UID: \"ce43b92c-92e6-45b3-abf9-20db64e0ec05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" Feb 28 03:37:56 crc 
kubenswrapper[4819]: I0228 03:37:56.728142 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmbp7\" (UniqueName: \"kubernetes.io/projected/ef03f8a2-9dde-4ee7-af68-2693ddf59fb4-kube-api-access-rmbp7\") pod \"service-ca-operator-777779d784-w5n5t\" (UID: \"ef03f8a2-9dde-4ee7-af68-2693ddf59fb4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728160 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728175 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bba4110-e884-4369-8e1e-676a7afed536-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5pnmf\" (UID: \"0bba4110-e884-4369-8e1e-676a7afed536\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728199 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjlv7\" (UniqueName: \"kubernetes.io/projected/dea5b1f1-4b1d-4741-be96-ec4c56b3c3db-kube-api-access-tjlv7\") pod \"migrator-59844c95c7-b7z5m\" (UID: \"dea5b1f1-4b1d-4741-be96-ec4c56b3c3db\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728214 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9st68\" (UniqueName: 
\"kubernetes.io/projected/55ccad0a-8169-4f92-89e1-6b8db13f255b-kube-api-access-9st68\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728232 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-csi-data-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728309 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30d3c101-bb83-41b0-88b0-a09b6135d7d8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dc26n\" (UID: \"30d3c101-bb83-41b0-88b0-a09b6135d7d8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728326 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a452d812-570e-4a9a-a473-c7bfa1daffe1-proxy-tls\") pod \"machine-config-controller-84d6567774-s2mhl\" (UID: \"a452d812-570e-4a9a-a473-c7bfa1daffe1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728341 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66425eb0-e8aa-44b7-b316-10949f9cc414-cert\") pod \"ingress-canary-c2zzz\" (UID: \"66425eb0-e8aa-44b7-b316-10949f9cc414\") " pod="openshift-ingress-canary/ingress-canary-c2zzz" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728360 4819 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a452d812-570e-4a9a-a473-c7bfa1daffe1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-s2mhl\" (UID: \"a452d812-570e-4a9a-a473-c7bfa1daffe1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728383 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75zxt\" (UID: \"f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728415 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-metrics-tls\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728431 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a12e5e4-9129-4a00-ae0c-684869c0cff7-secret-volume\") pod \"collect-profiles-29537490-qt4j4\" (UID: \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.728980 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c1530a3b-acb7-4a9a-bc2b-339db29fa05f-srv-cert\") pod \"olm-operator-6b444d44fb-pg9r5\" (UID: \"c1530a3b-acb7-4a9a-bc2b-339db29fa05f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729006 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55ccad0a-8169-4f92-89e1-6b8db13f255b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729024 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c8fc9078-b36f-49a3-b53c-67f60a904e8d-signing-key\") pod \"service-ca-9c57cc56f-6j222\" (UID: \"c8fc9078-b36f-49a3-b53c-67f60a904e8d\") " pod="openshift-service-ca/service-ca-9c57cc56f-6j222" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729040 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5hdh\" (UniqueName: \"kubernetes.io/projected/c1530a3b-acb7-4a9a-bc2b-339db29fa05f-kube-api-access-t5hdh\") pod \"olm-operator-6b444d44fb-pg9r5\" (UID: \"c1530a3b-acb7-4a9a-bc2b-339db29fa05f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729067 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-bound-sa-token\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729091 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwm4t\" (UniqueName: 
\"kubernetes.io/projected/3a12e5e4-9129-4a00-ae0c-684869c0cff7-kube-api-access-bwm4t\") pod \"collect-profiles-29537490-qt4j4\" (UID: \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729108 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef03f8a2-9dde-4ee7-af68-2693ddf59fb4-serving-cert\") pod \"service-ca-operator-777779d784-w5n5t\" (UID: \"ef03f8a2-9dde-4ee7-af68-2693ddf59fb4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729121 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729154 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-apiservice-cert\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: \"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729169 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a19bd4f6-2bd3-404f-9282-89a12444562f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f6bql\" (UID: \"a19bd4f6-2bd3-404f-9282-89a12444562f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql" Feb 28 
03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729184 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlnl4\" (UniqueName: \"kubernetes.io/projected/a452d812-570e-4a9a-a473-c7bfa1daffe1-kube-api-access-hlnl4\") pod \"machine-config-controller-84d6567774-s2mhl\" (UID: \"a452d812-570e-4a9a-a473-c7bfa1daffe1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729199 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7m2k\" (UniqueName: \"kubernetes.io/projected/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-kube-api-access-s7m2k\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729224 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/47b61e92-8bbd-40e0-96ee-b3bacd20950d-etcd-client\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729238 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-webhook-cert\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: \"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729286 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a12e5e4-9129-4a00-ae0c-684869c0cff7-config-volume\") pod \"collect-profiles-29537490-qt4j4\" (UID: 
\"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729304 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55ccad0a-8169-4f92-89e1-6b8db13f255b-images\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729318 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ce43b92c-92e6-45b3-abf9-20db64e0ec05-srv-cert\") pod \"catalog-operator-68c6474976-mxlcb\" (UID: \"ce43b92c-92e6-45b3-abf9-20db64e0ec05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729335 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-plugins-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729361 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729400 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-certificates\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729416 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzsvq\" (UniqueName: \"kubernetes.io/projected/0a08aebb-db7e-488c-b992-2286ba6c9fd0-kube-api-access-mzsvq\") pod \"marketplace-operator-79b997595-nb49n\" (UID: \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729432 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbzcf\" (UniqueName: \"kubernetes.io/projected/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-kube-api-access-qbzcf\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729457 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47b61e92-8bbd-40e0-96ee-b3bacd20950d-serving-cert\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729476 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" Feb 28 03:37:56 crc 
kubenswrapper[4819]: I0228 03:37:56.729513 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef03f8a2-9dde-4ee7-af68-2693ddf59fb4-config\") pod \"service-ca-operator-777779d784-w5n5t\" (UID: \"ef03f8a2-9dde-4ee7-af68-2693ddf59fb4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729539 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b61e92-8bbd-40e0-96ee-b3bacd20950d-config\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729553 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nb49n\" (UID: \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729568 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19bd4f6-2bd3-404f-9282-89a12444562f-config\") pod \"kube-controller-manager-operator-78b949d7b-f6bql\" (UID: \"a19bd4f6-2bd3-404f-9282-89a12444562f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729583 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f94n4\" (UniqueName: \"kubernetes.io/projected/c8fc9078-b36f-49a3-b53c-67f60a904e8d-kube-api-access-f94n4\") pod \"service-ca-9c57cc56f-6j222\" (UID: 
\"c8fc9078-b36f-49a3-b53c-67f60a904e8d\") " pod="openshift-service-ca/service-ca-9c57cc56f-6j222" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729598 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30d3c101-bb83-41b0-88b0-a09b6135d7d8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dc26n\" (UID: \"30d3c101-bb83-41b0-88b0-a09b6135d7d8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729613 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4mp8\" (UniqueName: \"kubernetes.io/projected/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-kube-api-access-r4mp8\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: \"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729645 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-trusted-ca\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729661 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nb49n\" (UID: \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729677 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/55ccad0a-8169-4f92-89e1-6b8db13f255b-proxy-tls\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729691 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a19bd4f6-2bd3-404f-9282-89a12444562f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f6bql\" (UID: \"a19bd4f6-2bd3-404f-9282-89a12444562f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729708 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x65gw\" (UniqueName: \"kubernetes.io/projected/b8f7cbee-14ae-4ae9-a139-8eea6fe271a3-kube-api-access-x65gw\") pod \"machine-config-server-2s2sf\" (UID: \"b8f7cbee-14ae-4ae9-a139-8eea6fe271a3\") " pod="openshift-machine-config-operator/machine-config-server-2s2sf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729725 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-tls\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729741 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75zxt\" (UID: \"f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729755 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f70ddc86-ef85-47d5-a9de-2c2a314ca39b-config-volume\") pod \"dns-default-q8wxx\" (UID: \"f70ddc86-ef85-47d5-a9de-2c2a314ca39b\") " pod="openshift-dns/dns-default-q8wxx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729794 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75zxt\" (UID: \"f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729813 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/47b61e92-8bbd-40e0-96ee-b3bacd20950d-etcd-ca\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729829 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-socket-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729853 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-tmpfs\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: 
\"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729869 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-registration-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729885 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j49vs\" (UniqueName: \"kubernetes.io/projected/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-kube-api-access-j49vs\") pod \"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729901 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-mountpoint-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729929 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ce43b92c-92e6-45b3-abf9-20db64e0ec05-profile-collector-cert\") pod \"catalog-operator-68c6474976-mxlcb\" (UID: \"ce43b92c-92e6-45b3-abf9-20db64e0ec05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729944 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729962 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/47b61e92-8bbd-40e0-96ee-b3bacd20950d-etcd-service-ca\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729979 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.729997 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbqcs\" (UniqueName: \"kubernetes.io/projected/47b61e92-8bbd-40e0-96ee-b3bacd20950d-kube-api-access-bbqcs\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.730024 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c8fc9078-b36f-49a3-b53c-67f60a904e8d-signing-cabundle\") pod \"service-ca-9c57cc56f-6j222\" (UID: \"c8fc9078-b36f-49a3-b53c-67f60a904e8d\") " pod="openshift-service-ca/service-ca-9c57cc56f-6j222" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.730686 4819 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c8fc9078-b36f-49a3-b53c-67f60a904e8d-signing-cabundle\") pod \"service-ca-9c57cc56f-6j222\" (UID: \"c8fc9078-b36f-49a3-b53c-67f60a904e8d\") " pod="openshift-service-ca/service-ca-9c57cc56f-6j222" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.733437 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f70ddc86-ef85-47d5-a9de-2c2a314ca39b-metrics-tls\") pod \"dns-default-q8wxx\" (UID: \"f70ddc86-ef85-47d5-a9de-2c2a314ca39b\") " pod="openshift-dns/dns-default-q8wxx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.737594 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a12e5e4-9129-4a00-ae0c-684869c0cff7-secret-volume\") pod \"collect-profiles-29537490-qt4j4\" (UID: \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.738727 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55ccad0a-8169-4f92-89e1-6b8db13f255b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.743025 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8"] Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.743062 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw"] Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.743071 4819 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85"] Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.743219 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a452d812-570e-4a9a-a473-c7bfa1daffe1-proxy-tls\") pod \"machine-config-controller-84d6567774-s2mhl\" (UID: \"a452d812-570e-4a9a-a473-c7bfa1daffe1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.744762 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.745146 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7rkdb"] Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.745726 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f70ddc86-ef85-47d5-a9de-2c2a314ca39b-config-volume\") pod \"dns-default-q8wxx\" (UID: \"f70ddc86-ef85-47d5-a9de-2c2a314ca39b\") " pod="openshift-dns/dns-default-q8wxx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.746188 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/47b61e92-8bbd-40e0-96ee-b3bacd20950d-etcd-ca\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.746874 4819 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-trusted-ca\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.747497 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75zxt\" (UID: \"f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.747537 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30d3c101-bb83-41b0-88b0-a09b6135d7d8-config\") pod \"kube-apiserver-operator-766d6c64bb-dc26n\" (UID: \"30d3c101-bb83-41b0-88b0-a09b6135d7d8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.747672 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bba4110-e884-4369-8e1e-676a7afed536-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5pnmf\" (UID: \"0bba4110-e884-4369-8e1e-676a7afed536\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.747844 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-tmpfs\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: \"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.749893 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75zxt\" (UID: \"f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.750132 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.750603 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c8fc9078-b36f-49a3-b53c-67f60a904e8d-signing-key\") pod \"service-ca-9c57cc56f-6j222\" (UID: \"c8fc9078-b36f-49a3-b53c-67f60a904e8d\") " pod="openshift-service-ca/service-ca-9c57cc56f-6j222" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.751157 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55ccad0a-8169-4f92-89e1-6b8db13f255b-images\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.754346 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.755019 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b61e92-8bbd-40e0-96ee-b3bacd20950d-config\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.756134 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-tls\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.757168 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a12e5e4-9129-4a00-ae0c-684869c0cff7-config-volume\") pod \"collect-profiles-29537490-qt4j4\" (UID: \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.757591 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c1530a3b-acb7-4a9a-bc2b-339db29fa05f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pg9r5\" (UID: \"c1530a3b-acb7-4a9a-bc2b-339db29fa05f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.758290 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/47b61e92-8bbd-40e0-96ee-b3bacd20950d-etcd-client\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.758364 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ce43b92c-92e6-45b3-abf9-20db64e0ec05-profile-collector-cert\") pod \"catalog-operator-68c6474976-mxlcb\" (UID: \"ce43b92c-92e6-45b3-abf9-20db64e0ec05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.758710 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a19bd4f6-2bd3-404f-9282-89a12444562f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f6bql\" (UID: \"a19bd4f6-2bd3-404f-9282-89a12444562f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.759856 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19bd4f6-2bd3-404f-9282-89a12444562f-config\") pod \"kube-controller-manager-operator-78b949d7b-f6bql\" (UID: \"a19bd4f6-2bd3-404f-9282-89a12444562f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.760185 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-apiservice-cert\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: \"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 
03:37:56.760888 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nb49n\" (UID: \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:37:56 crc kubenswrapper[4819]: E0228 03:37:56.760890 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:57.260870241 +0000 UTC m=+215.726439159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.761013 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef03f8a2-9dde-4ee7-af68-2693ddf59fb4-serving-cert\") pod \"service-ca-operator-777779d784-w5n5t\" (UID: \"ef03f8a2-9dde-4ee7-af68-2693ddf59fb4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.761723 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-trusted-ca\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.761910 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-certificates\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.762170 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a452d812-570e-4a9a-a473-c7bfa1daffe1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-s2mhl\" (UID: \"a452d812-570e-4a9a-a473-c7bfa1daffe1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.764714 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nb49n\" (UID: \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.765731 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef03f8a2-9dde-4ee7-af68-2693ddf59fb4-config\") pod \"service-ca-operator-777779d784-w5n5t\" (UID: \"ef03f8a2-9dde-4ee7-af68-2693ddf59fb4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.767029 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/30d3c101-bb83-41b0-88b0-a09b6135d7d8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dc26n\" (UID: \"30d3c101-bb83-41b0-88b0-a09b6135d7d8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.767677 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c1530a3b-acb7-4a9a-bc2b-339db29fa05f-srv-cert\") pod \"olm-operator-6b444d44fb-pg9r5\" (UID: \"c1530a3b-acb7-4a9a-bc2b-339db29fa05f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.768712 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.769909 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47b61e92-8bbd-40e0-96ee-b3bacd20950d-serving-cert\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.770019 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ce43b92c-92e6-45b3-abf9-20db64e0ec05-srv-cert\") pod \"catalog-operator-68c6474976-mxlcb\" (UID: \"ce43b92c-92e6-45b3-abf9-20db64e0ec05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.770623 4819 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-webhook-cert\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: \"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.772836 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-metrics-tls\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.776148 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55ccad0a-8169-4f92-89e1-6b8db13f255b-proxy-tls\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.776538 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6k5k\" (UniqueName: \"kubernetes.io/projected/ce43b92c-92e6-45b3-abf9-20db64e0ec05-kube-api-access-f6k5k\") pod \"catalog-operator-68c6474976-mxlcb\" (UID: \"ce43b92c-92e6-45b3-abf9-20db64e0ec05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.793588 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44wgm\" (UniqueName: \"kubernetes.io/projected/0bba4110-e884-4369-8e1e-676a7afed536-kube-api-access-44wgm\") pod \"package-server-manager-789f6589d5-5pnmf\" (UID: \"0bba4110-e884-4369-8e1e-676a7afed536\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" 
Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.808002 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcwjt\" (UniqueName: \"kubernetes.io/projected/f70ddc86-ef85-47d5-a9de-2c2a314ca39b-kube-api-access-gcwjt\") pod \"dns-default-q8wxx\" (UID: \"f70ddc86-ef85-47d5-a9de-2c2a314ca39b\") " pod="openshift-dns/dns-default-q8wxx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831081 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbzcf\" (UniqueName: \"kubernetes.io/projected/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-kube-api-access-qbzcf\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831164 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x65gw\" (UniqueName: \"kubernetes.io/projected/b8f7cbee-14ae-4ae9-a139-8eea6fe271a3-kube-api-access-x65gw\") pod \"machine-config-server-2s2sf\" (UID: \"b8f7cbee-14ae-4ae9-a139-8eea6fe271a3\") " pod="openshift-machine-config-operator/machine-config-server-2s2sf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831213 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-socket-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831265 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-registration-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 
03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831299 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-mountpoint-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831339 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwqq9\" (UniqueName: \"kubernetes.io/projected/66425eb0-e8aa-44b7-b316-10949f9cc414-kube-api-access-gwqq9\") pod \"ingress-canary-c2zzz\" (UID: \"66425eb0-e8aa-44b7-b316-10949f9cc414\") " pod="openshift-ingress-canary/ingress-canary-c2zzz" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831376 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b8f7cbee-14ae-4ae9-a139-8eea6fe271a3-node-bootstrap-token\") pod \"machine-config-server-2s2sf\" (UID: \"b8f7cbee-14ae-4ae9-a139-8eea6fe271a3\") " pod="openshift-machine-config-operator/machine-config-server-2s2sf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831407 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b8f7cbee-14ae-4ae9-a139-8eea6fe271a3-certs\") pod \"machine-config-server-2s2sf\" (UID: \"b8f7cbee-14ae-4ae9-a139-8eea6fe271a3\") " pod="openshift-machine-config-operator/machine-config-server-2s2sf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831473 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831517 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-csi-data-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831554 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66425eb0-e8aa-44b7-b316-10949f9cc414-cert\") pod \"ingress-canary-c2zzz\" (UID: \"66425eb0-e8aa-44b7-b316-10949f9cc414\") " pod="openshift-ingress-canary/ingress-canary-c2zzz" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.831658 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-plugins-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.832006 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-plugins-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.832099 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-socket-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: 
I0228 03:37:56.832171 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-registration-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.832206 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-mountpoint-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: E0228 03:37:56.832495 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:37:57.332481452 +0000 UTC m=+215.798050300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.832948 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-csi-data-dir\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.833162 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.838190 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66425eb0-e8aa-44b7-b316-10949f9cc414-cert\") pod \"ingress-canary-c2zzz\" (UID: \"66425eb0-e8aa-44b7-b316-10949f9cc414\") " pod="openshift-ingress-canary/ingress-canary-c2zzz" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.840348 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.840886 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b8f7cbee-14ae-4ae9-a139-8eea6fe271a3-node-bootstrap-token\") pod \"machine-config-server-2s2sf\" (UID: \"b8f7cbee-14ae-4ae9-a139-8eea6fe271a3\") " pod="openshift-machine-config-operator/machine-config-server-2s2sf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.841532 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b8f7cbee-14ae-4ae9-a139-8eea6fe271a3-certs\") pod \"machine-config-server-2s2sf\" (UID: \"b8f7cbee-14ae-4ae9-a139-8eea6fe271a3\") " pod="openshift-machine-config-operator/machine-config-server-2s2sf" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.849865 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmbp7\" (UniqueName: \"kubernetes.io/projected/ef03f8a2-9dde-4ee7-af68-2693ddf59fb4-kube-api-access-rmbp7\") pod \"service-ca-operator-777779d784-w5n5t\" (UID: \"ef03f8a2-9dde-4ee7-af68-2693ddf59fb4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.869674 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w5fln"] Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.872262 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75zxt\" (UID: \"f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 
03:37:56.878405 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl"] Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.888133 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjlv7\" (UniqueName: \"kubernetes.io/projected/dea5b1f1-4b1d-4741-be96-ec4c56b3c3db-kube-api-access-tjlv7\") pod \"migrator-59844c95c7-b7z5m\" (UID: \"dea5b1f1-4b1d-4741-be96-ec4c56b3c3db\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.888508 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-q8wxx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.898438 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-trsfk"] Feb 28 03:37:56 crc kubenswrapper[4819]: W0228 03:37:56.903918 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e568a4f_33f8_447b_840c_dc560774878d.slice/crio-1012b95f74b2f6184dad15b63b524fef83ee042b0c64b456571dc6e6bec48c3e WatchSource:0}: Error finding container 1012b95f74b2f6184dad15b63b524fef83ee042b0c64b456571dc6e6bec48c3e: Status 404 returned error can't find the container with id 1012b95f74b2f6184dad15b63b524fef83ee042b0c64b456571dc6e6bec48c3e Feb 28 03:37:56 crc kubenswrapper[4819]: W0228 03:37:56.915131 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56a73ad1_cc3f_445b_8e0a_d8ff6937ba57.slice/crio-c394f87d6830f9554f58fe7d6db33ebfe27281850bac149f6684896848d5541f WatchSource:0}: Error finding container c394f87d6830f9554f58fe7d6db33ebfe27281850bac149f6684896848d5541f: Status 404 returned error can't find the container with id c394f87d6830f9554f58fe7d6db33ebfe27281850bac149f6684896848d5541f Feb 
28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.936014 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2tc9v"] Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.936086 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72"] Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.936101 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"] Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.937435 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.941514 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30d3c101-bb83-41b0-88b0-a09b6135d7d8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dc26n\" (UID: \"30d3c101-bb83-41b0-88b0-a09b6135d7d8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.941622 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9st68\" (UniqueName: \"kubernetes.io/projected/55ccad0a-8169-4f92-89e1-6b8db13f255b-kube-api-access-9st68\") pod \"machine-config-operator-74547568cd-wtnpx\" (UID: \"55ccad0a-8169-4f92-89e1-6b8db13f255b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.942079 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.942173 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" Feb 28 03:37:56 crc kubenswrapper[4819]: E0228 03:37:56.942623 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:57.442603531 +0000 UTC m=+215.908172389 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.959684 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49vs\" (UniqueName: \"kubernetes.io/projected/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-kube-api-access-j49vs\") pod \"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.983669 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5hdh\" (UniqueName: \"kubernetes.io/projected/c1530a3b-acb7-4a9a-bc2b-339db29fa05f-kube-api-access-t5hdh\") pod \"olm-operator-6b444d44fb-pg9r5\" (UID: \"c1530a3b-acb7-4a9a-bc2b-339db29fa05f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" Feb 28 03:37:56 crc kubenswrapper[4819]: I0228 03:37:56.990746 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlnl4\" (UniqueName: \"kubernetes.io/projected/a452d812-570e-4a9a-a473-c7bfa1daffe1-kube-api-access-hlnl4\") pod \"machine-config-controller-84d6567774-s2mhl\" (UID: \"a452d812-570e-4a9a-a473-c7bfa1daffe1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.007605 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7m2k\" (UniqueName: \"kubernetes.io/projected/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-kube-api-access-s7m2k\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.020074 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tdcjf"] Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.029885 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b4blq"] Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.032921 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf7n9\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-kube-api-access-vf7n9\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.044845 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:57 crc kubenswrapper[4819]: E0228 03:37:57.045146 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:37:57.545134762 +0000 UTC m=+216.010703620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.055867 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.061101 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w82dc\" (UniqueName: \"kubernetes.io/projected/23ec7136-9dc9-47c2-bf41-7b798e6bfe60-kube-api-access-w82dc\") pod \"auto-csr-approver-29537496-dqglv\" (UID: \"23ec7136-9dc9-47c2-bf41-7b798e6bfe60\") " pod="openshift-infra/auto-csr-approver-29537496-dqglv" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.065521 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.071043 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbqcs\" (UniqueName: \"kubernetes.io/projected/47b61e92-8bbd-40e0-96ee-b3bacd20950d-kube-api-access-bbqcs\") pod \"etcd-operator-b45778765-t5bm5\" (UID: \"47b61e92-8bbd-40e0-96ee-b3bacd20950d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.086739 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05c5a8ba-5dcc-460d-830e-f55967fa0dbf-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h6d62\" (UID: \"05c5a8ba-5dcc-460d-830e-f55967fa0dbf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.095883 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"] Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.106236 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4mp8\" (UniqueName: \"kubernetes.io/projected/f6a8f6cb-88cc-45a6-bc66-49e33d32851e-kube-api-access-r4mp8\") pod \"packageserver-d55dfcdfc-pgpkd\" (UID: \"f6a8f6cb-88cc-45a6-bc66-49e33d32851e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.112838 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.122716 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.125940 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf"] Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.129214 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a19bd4f6-2bd3-404f-9282-89a12444562f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f6bql\" (UID: \"a19bd4f6-2bd3-404f-9282-89a12444562f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.145872 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.146415 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x"] Feb 28 03:37:57 crc kubenswrapper[4819]: E0228 03:37:57.146499 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:57.646467482 +0000 UTC m=+216.112036340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.148594 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-bound-sa-token\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.156984 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7nkqj"] Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.165866 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537496-dqglv" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.171566 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.172761 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v4td8\" (UID: \"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.181508 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.188677 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwm4t\" (UniqueName: \"kubernetes.io/projected/3a12e5e4-9129-4a00-ae0c-684869c0cff7-kube-api-access-bwm4t\") pod \"collect-profiles-29537490-qt4j4\" (UID: \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.189801 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rtwrl"] Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.198285 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bpmrd"] Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.203945 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-77ljw"] Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.208980 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f94n4\" (UniqueName: \"kubernetes.io/projected/c8fc9078-b36f-49a3-b53c-67f60a904e8d-kube-api-access-f94n4\") pod \"service-ca-9c57cc56f-6j222\" (UID: \"c8fc9078-b36f-49a3-b53c-67f60a904e8d\") " pod="openshift-service-ca/service-ca-9c57cc56f-6j222" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.209836 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt"] Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.224305 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzsvq\" (UniqueName: \"kubernetes.io/projected/0a08aebb-db7e-488c-b992-2286ba6c9fd0-kube-api-access-mzsvq\") pod \"marketplace-operator-79b997595-nb49n\" (UID: 
\"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.245806 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.247238 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:57 crc kubenswrapper[4819]: E0228 03:37:57.247603 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:37:57.747591118 +0000 UTC m=+216.213159976 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.252130 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.258058 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6j222" Feb 28 03:37:57 crc kubenswrapper[4819]: W0228 03:37:57.261957 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod304c4783_b723_4164_b000_8ae81986da3a.slice/crio-53ac441d22e2f568367cc91cdf29944d451e6945f5734432182b2f29a8af17a5 WatchSource:0}: Error finding container 53ac441d22e2f568367cc91cdf29944d451e6945f5734432182b2f29a8af17a5: Status 404 returned error can't find the container with id 53ac441d22e2f568367cc91cdf29944d451e6945f5734432182b2f29a8af17a5 Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.265709 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwqq9\" (UniqueName: \"kubernetes.io/projected/66425eb0-e8aa-44b7-b316-10949f9cc414-kube-api-access-gwqq9\") pod \"ingress-canary-c2zzz\" (UID: \"66425eb0-e8aa-44b7-b316-10949f9cc414\") " pod="openshift-ingress-canary/ingress-canary-c2zzz" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.281420 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.286354 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbzcf\" (UniqueName: \"kubernetes.io/projected/85ff71ae-ba6c-4f29-9645-1eb02dc904f1-kube-api-access-qbzcf\") pod \"csi-hostpathplugin-b72rs\" (UID: \"85ff71ae-ba6c-4f29-9645-1eb02dc904f1\") " pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:57 crc kubenswrapper[4819]: W0228 03:37:57.287299 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded7e72bf_07cf_4643_80da_6d11f847a61b.slice/crio-b016ae684c6241d991f01310311411b07ef28c206b89190e7d5e81b4726cbb33 WatchSource:0}: Error finding container b016ae684c6241d991f01310311411b07ef28c206b89190e7d5e81b4726cbb33: Status 404 returned error can't find the container with id b016ae684c6241d991f01310311411b07ef28c206b89190e7d5e81b4726cbb33 Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.308467 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x65gw\" (UniqueName: \"kubernetes.io/projected/b8f7cbee-14ae-4ae9-a139-8eea6fe271a3-kube-api-access-x65gw\") pod \"machine-config-server-2s2sf\" (UID: \"b8f7cbee-14ae-4ae9-a139-8eea6fe271a3\") " pod="openshift-machine-config-operator/machine-config-server-2s2sf" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.348594 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:57 crc kubenswrapper[4819]: E0228 03:37:57.348859 4819 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:57.848844727 +0000 UTC m=+216.314413585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.378123 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.385951 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.403534 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.429432 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.451972 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:57 crc kubenswrapper[4819]: E0228 03:37:57.452292 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:37:57.95227624 +0000 UTC m=+216.417845098 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.474397 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-q8wxx"] Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.494922 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5"] Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.495874 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c2zzz" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.497480 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-trsfk" event={"ID":"56a73ad1-cc3f-445b-8e0a-d8ff6937ba57","Type":"ContainerStarted","Data":"94dc3518fa348e648d8fc4c913d359f77eb2a9a6fcb620012289005c9705cd7e"} Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.497504 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-trsfk" event={"ID":"56a73ad1-cc3f-445b-8e0a-d8ff6937ba57","Type":"ContainerStarted","Data":"c394f87d6830f9554f58fe7d6db33ebfe27281850bac149f6684896848d5541f"} Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.498161 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-trsfk" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.501303 4819 patch_prober.go:28] interesting pod/downloads-7954f5f757-trsfk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.501339 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-trsfk" podUID="56a73ad1-cc3f-445b-8e0a-d8ff6937ba57" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.503767 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-b72rs" Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.505447 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj" event={"ID":"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a","Type":"ContainerStarted","Data":"090555b3b381d6779ff106f25377fc31efc1658c430d6e2c5b98a9d3fa2b37d2"} Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.507642 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72" event={"ID":"ac6ce0f1-1683-417d-8a4b-aa5067e21b2b","Type":"ContainerStarted","Data":"a1b2e045628c66be17c937ec6cbb62e1be2c6b52283864c71cb057179006f58b"} Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.509599 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" event={"ID":"b6a78496-9606-4297-b022-286a969e9ea6","Type":"ContainerStarted","Data":"e7a6b725a83221383e721f85af78de9bf9b3dd9b47429cd91b61b5d7b6ef9f01"} Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.512157 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" event={"ID":"f3590dc7-98a1-45cf-a420-f045d5d38335","Type":"ContainerStarted","Data":"2539baa888cd9b9f9a34f873624859bb0d7ba333ca3c6140efaa649ffd477209"} Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.512781 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" event={"ID":"0bba4110-e884-4369-8e1e-676a7afed536","Type":"ContainerStarted","Data":"57aaa8bebabaa91d4f9668254af989dd78484f7d061a53c3b0ac9715c65df913"} Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.514709 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85" event={"ID":"506dcbc2-d163-4d73-874c-c5eb62d75dd7","Type":"ContainerStarted","Data":"436c0eccd3878eb7b9e64c8931a8702cbed2ec77c135c7eb0585f8749f047741"} Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.514729 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85" event={"ID":"506dcbc2-d163-4d73-874c-c5eb62d75dd7","Type":"ContainerStarted","Data":"875e8a8d2e4e124cb4e407e36d2b2570c5d5e8764a9f226e768daaf672d9de74"} Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.517195 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" event={"ID":"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1","Type":"ContainerStarted","Data":"947423a8deb59df561c6150a452a907de8474e4b8e3058d144c28be61a5f8674"} Feb 28 03:37:57 crc kubenswrapper[4819]: I0228 03:37:57.527792 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2s2sf" Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.161419 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8" event={"ID":"2e0fbd3f-cc84-4c5d-b4ba-116268fc0625","Type":"ContainerStarted","Data":"cbdb86e31c511aafa662b65b1559a08cbfc5fa0560f73f6669d002d770fbab14"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.161489 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8" event={"ID":"2e0fbd3f-cc84-4c5d-b4ba-116268fc0625","Type":"ContainerStarted","Data":"47013fcebc36bb3dab0292da7571051af5e0e9cce604898cc61e436da3c01dc2"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.165312 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:58 crc kubenswrapper[4819]: E0228 03:37:58.165599 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:59.165577233 +0000 UTC m=+217.631146091 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.186857 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl" event={"ID":"ed7e72bf-07cf-4643-80da-6d11f847a61b","Type":"ContainerStarted","Data":"b016ae684c6241d991f01310311411b07ef28c206b89190e7d5e81b4726cbb33"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.194339 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t9qxt" event={"ID":"7a723a02-07f6-42e8-8317-b05eef10e3d8","Type":"ContainerStarted","Data":"55ac19021968e5775945901aaa99b927e58ab602f89e5319d5acdb7c94a04fe2"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.194431 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t9qxt" event={"ID":"7a723a02-07f6-42e8-8317-b05eef10e3d8","Type":"ContainerStarted","Data":"e93e7daa67b1b473bce18063568ff2c7c890857ac9144e4fadae41f3254d8b9d"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.198133 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" event={"ID":"257ec8d9-46b8-445b-883d-cd842a4b8b61","Type":"ContainerStarted","Data":"fb21b0de251f5a413d09bc649f5c318f8a3e46cfda0976e61289193a9aae11a7"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.198865 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-77ljw" 
event={"ID":"304c4783-b723-4164-b000-8ae81986da3a","Type":"ContainerStarted","Data":"53ac441d22e2f568367cc91cdf29944d451e6945f5734432182b2f29a8af17a5"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.199783 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb" event={"ID":"b245dea2-5356-4952-9edf-11b68761e382","Type":"ContainerStarted","Data":"0e1ebc0a9070a10878b8db986e946616837434f1a1a311d147c289a4156b716f"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.199805 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb" event={"ID":"b245dea2-5356-4952-9edf-11b68761e382","Type":"ContainerStarted","Data":"5080a4381a17d5793118fa0713281477d0dbdc3c22117e55fc515dc47b026c25"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.205299 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.235072 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x" event={"ID":"5e0c869d-a7dc-4145-af67-3eac7eb8f1a9","Type":"ContainerStarted","Data":"8065e3ff1a87b113e2215300160760c4f18339d1f10b83d2cf4edeb6f17518be"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.248586 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.273085 4819 generic.go:334] "Generic (PLEG): container finished" podID="ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8" containerID="6f8ff85c1b926fcfef24be9f8e2c73b7d619992d6323b6d2720a573a302e58d2" exitCode=0 Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.273218 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw" event={"ID":"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8","Type":"ContainerDied","Data":"6f8ff85c1b926fcfef24be9f8e2c73b7d619992d6323b6d2720a573a302e58d2"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.273313 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw" event={"ID":"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8","Type":"ContainerStarted","Data":"4e4f98e39ad0c0cfa9ba61e2c75935a351633702c19b30f7267e44a4af423f46"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.273654 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537496-dqglv"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.286381 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt" event={"ID":"f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38","Type":"ContainerStarted","Data":"840c55631cc4c08d901412773fe29ca69c8238b01545749410fe16c44284f32b"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.290035 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" event={"ID":"ed79a80c-ec3b-4446-9ad3-4e1906715cd7","Type":"ContainerStarted","Data":"abcdd2dca39812802595dcc5c183ead6946dd979994d492c5e8e334e423e26a4"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.295125 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.296029 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl" event={"ID":"5e568a4f-33f8-447b-840c-dc560774878d","Type":"ContainerStarted","Data":"99056b95e61163a1ac5b4c5860d601e14567ff80f3246d1b3db64472d0326819"} Feb 28 03:37:58 crc kubenswrapper[4819]: E0228 03:37:58.296081 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:58.796045759 +0000 UTC m=+217.261614617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.296082 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl" event={"ID":"5e568a4f-33f8-447b-840c-dc560774878d","Type":"ContainerStarted","Data":"1012b95f74b2f6184dad15b63b524fef83ee042b0c64b456571dc6e6bec48c3e"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.316170 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.317796 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds" event={"ID":"e72012e1-89cc-4b65-ab38-78f09ec59ea4","Type":"ContainerStarted","Data":"feeb56c215879fb47b1b04765177f875179b23bf2ded6562d90ce4e8d20db09e"} Feb 28 03:37:58 crc 
kubenswrapper[4819]: I0228 03:37:58.325859 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bpmrd" event={"ID":"1cb02855-da72-47a4-9456-3b3d9faf61a8","Type":"ContainerStarted","Data":"6b4a3f866a345fb492d8ef7f8e117e347e46f4bfb628d2369437481d9b2efdad"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.357512 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" event={"ID":"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e","Type":"ContainerStarted","Data":"541ab287d74d3ec1a41b7100e1270b510ee192f654491246d47500770f67bc3f"} Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.398422 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:58 crc kubenswrapper[4819]: E0228 03:37:58.399628 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:37:58.899609865 +0000 UTC m=+217.365178723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.455482 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.460025 4819 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.491669 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-t9qxt" podStartSLOduration=157.491653614 podStartE2EDuration="2m37.491653614s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:58.470737944 +0000 UTC m=+216.936306802" watchObservedRunningTime="2026-02-28 03:37:58.491653614 +0000 UTC m=+216.957222472" Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.505733 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:58 crc kubenswrapper[4819]: E0228 03:37:58.506061 4819 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:59.006047402 +0000 UTC m=+217.471616260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.607068 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:58 crc kubenswrapper[4819]: E0228 03:37:58.607416 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:37:59.107404874 +0000 UTC m=+217.572973732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.612389 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.616698 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h49g8" podStartSLOduration=158.616679864 podStartE2EDuration="2m38.616679864s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:58.614779367 +0000 UTC m=+217.080348235" watchObservedRunningTime="2026-02-28 03:37:58.616679864 +0000 UTC m=+217.082248712" Feb 28 03:37:58 crc kubenswrapper[4819]: W0228 03:37:58.626919 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8f7cbee_14ae_4ae9_a139_8eea6fe271a3.slice/crio-6324936fa886a8f2f19763bab7a907ea1336ae711b082f11ae748076e640e038 WatchSource:0}: Error finding container 6324936fa886a8f2f19763bab7a907ea1336ae711b082f11ae748076e640e038: Status 404 returned error can't find the container with id 6324936fa886a8f2f19763bab7a907ea1336ae711b082f11ae748076e640e038 Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.648526 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qv8xl" podStartSLOduration=157.648510556 podStartE2EDuration="2m37.648510556s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:58.646033345 +0000 UTC m=+217.111602223" watchObservedRunningTime="2026-02-28 03:37:58.648510556 +0000 UTC m=+217.114079414" Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.649693 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6j222"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.668014 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-t9qxt" Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.683937 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 03:37:58 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld Feb 28 03:37:58 crc kubenswrapper[4819]: [+]process-running ok Feb 28 03:37:58 crc kubenswrapper[4819]: healthz check failed Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.684008 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.698032 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.700581 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.707638 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:58 crc kubenswrapper[4819]: E0228 03:37:58.708014 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:59.208000306 +0000 UTC m=+217.673569164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.741993 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-trsfk" podStartSLOduration=158.741974571 podStartE2EDuration="2m38.741974571s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:58.728940767 +0000 UTC m=+217.194509635" watchObservedRunningTime="2026-02-28 03:37:58.741974571 +0000 UTC m=+217.207543429" Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.742668 4819 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.759486 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.761976 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t5bm5"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.809185 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:58 crc kubenswrapper[4819]: E0228 03:37:58.809552 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:37:59.309540802 +0000 UTC m=+217.775109660 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.831018 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.910866 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:58 crc kubenswrapper[4819]: E0228 03:37:58.912852 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:59.412829281 +0000 UTC m=+217.878398139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.928098 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nb49n"] Feb 28 03:37:58 crc kubenswrapper[4819]: I0228 03:37:58.940065 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c2zzz"] Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.014283 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:59 crc kubenswrapper[4819]: E0228 03:37:59.014585 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:37:59.514571862 +0000 UTC m=+217.980140730 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:59 crc kubenswrapper[4819]: W0228 03:37:59.037300 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8fc9078_b36f_49a3_b53c_67f60a904e8d.slice/crio-24a262dd841c2a83e22526578b34497415a1032df2a06fe9a9d960a98276a6d7 WatchSource:0}: Error finding container 24a262dd841c2a83e22526578b34497415a1032df2a06fe9a9d960a98276a6d7: Status 404 returned error can't find the container with id 24a262dd841c2a83e22526578b34497415a1032df2a06fe9a9d960a98276a6d7 Feb 28 03:37:59 crc kubenswrapper[4819]: W0228 03:37:59.045695 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce43b92c_92e6_45b3_abf9_20db64e0ec05.slice/crio-730dc3bff3a6a29779ad43ae3d8ca78a0c4a8dd6df54eadd303b7a5d92f4b682 WatchSource:0}: Error finding container 730dc3bff3a6a29779ad43ae3d8ca78a0c4a8dd6df54eadd303b7a5d92f4b682: Status 404 returned error can't find the container with id 730dc3bff3a6a29779ad43ae3d8ca78a0c4a8dd6df54eadd303b7a5d92f4b682 Feb 28 03:37:59 crc kubenswrapper[4819]: W0228 03:37:59.049916 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47b61e92_8bbd_40e0_96ee_b3bacd20950d.slice/crio-0e5bac5320a136d149d1afc933166bd38306bcc0ccff260fa18f94a84c712071 WatchSource:0}: Error finding container 0e5bac5320a136d149d1afc933166bd38306bcc0ccff260fa18f94a84c712071: Status 404 returned error can't find the container 
with id 0e5bac5320a136d149d1afc933166bd38306bcc0ccff260fa18f94a84c712071 Feb 28 03:37:59 crc kubenswrapper[4819]: W0228 03:37:59.052681 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda452d812_570e_4a9a_a473_c7bfa1daffe1.slice/crio-97edc180e9c43523d4b207b3dde79f4c107f6670f323841ba4bbba2ed5d83d2b WatchSource:0}: Error finding container 97edc180e9c43523d4b207b3dde79f4c107f6670f323841ba4bbba2ed5d83d2b: Status 404 returned error can't find the container with id 97edc180e9c43523d4b207b3dde79f4c107f6670f323841ba4bbba2ed5d83d2b Feb 28 03:37:59 crc kubenswrapper[4819]: W0228 03:37:59.056939 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a38ecc3_8eb5_4916_b2ba_b0e64bb4722f.slice/crio-c9b897f52281c90f3e11dd3652cefdecbf9a9691fa29a6fc6ab42ab7603ebc7f WatchSource:0}: Error finding container c9b897f52281c90f3e11dd3652cefdecbf9a9691fa29a6fc6ab42ab7603ebc7f: Status 404 returned error can't find the container with id c9b897f52281c90f3e11dd3652cefdecbf9a9691fa29a6fc6ab42ab7603ebc7f Feb 28 03:37:59 crc kubenswrapper[4819]: W0228 03:37:59.075967 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a08aebb_db7e_488c_b992_2286ba6c9fd0.slice/crio-aadc38e72b33d5a0a6160502cbdfa16994bdcad2504e525fc67a0f5f8227a7b3 WatchSource:0}: Error finding container aadc38e72b33d5a0a6160502cbdfa16994bdcad2504e525fc67a0f5f8227a7b3: Status 404 returned error can't find the container with id aadc38e72b33d5a0a6160502cbdfa16994bdcad2504e525fc67a0f5f8227a7b3 Feb 28 03:37:59 crc kubenswrapper[4819]: W0228 03:37:59.077438 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a12e5e4_9129_4a00_ae0c_684869c0cff7.slice/crio-ae8cb9cc4375adb1d1edd391ffdd7fe94671e73f49efca5fea98e3a368205915 
WatchSource:0}: Error finding container ae8cb9cc4375adb1d1edd391ffdd7fe94671e73f49efca5fea98e3a368205915: Status 404 returned error can't find the container with id ae8cb9cc4375adb1d1edd391ffdd7fe94671e73f49efca5fea98e3a368205915 Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.086681 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b72rs"] Feb 28 03:37:59 crc kubenswrapper[4819]: W0228 03:37:59.101872 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85ff71ae_ba6c_4f29_9645_1eb02dc904f1.slice/crio-13a8a59c355b53d2ec66427ef3b5f867f0fdbc3dd02620c7475d821678ecfe71 WatchSource:0}: Error finding container 13a8a59c355b53d2ec66427ef3b5f867f0fdbc3dd02620c7475d821678ecfe71: Status 404 returned error can't find the container with id 13a8a59c355b53d2ec66427ef3b5f867f0fdbc3dd02620c7475d821678ecfe71 Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.109543 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql"] Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.115097 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:59 crc kubenswrapper[4819]: E0228 03:37:59.116108 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:59.616079397 +0000 UTC m=+218.081648255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:59 crc kubenswrapper[4819]: W0228 03:37:59.133622 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda19bd4f6_2bd3_404f_9282_89a12444562f.slice/crio-f62f2419443dbd6661be044729f768ccebdb3796249bce6664cf57d8641bf384 WatchSource:0}: Error finding container f62f2419443dbd6661be044729f768ccebdb3796249bce6664cf57d8641bf384: Status 404 returned error can't find the container with id f62f2419443dbd6661be044729f768ccebdb3796249bce6664cf57d8641bf384 Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.218900 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:59 crc kubenswrapper[4819]: E0228 03:37:59.219263 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:37:59.719237993 +0000 UTC m=+218.184806851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.320111 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:59 crc kubenswrapper[4819]: E0228 03:37:59.320499 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:37:59.820472381 +0000 UTC m=+218.286041249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.371765 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb" event={"ID":"b245dea2-5356-4952-9edf-11b68761e382","Type":"ContainerStarted","Data":"f9e233fdeea75eca50a313d4f83ce4af04c43f49caebe97a0a1df5f96954fdaa"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.375477 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x" event={"ID":"5e0c869d-a7dc-4145-af67-3eac7eb8f1a9","Type":"ContainerStarted","Data":"cc74431f31aaa2f4952223275008a50bfc057996c585dcb86238d479e1257891"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.382132 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85" event={"ID":"506dcbc2-d163-4d73-874c-c5eb62d75dd7","Type":"ContainerStarted","Data":"75cb40b1e288022c5e624b81dc3a0093b2f29157a42dd0c65278649c92ae73fd"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.383373 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t" event={"ID":"ef03f8a2-9dde-4ee7-af68-2693ddf59fb4","Type":"ContainerStarted","Data":"8f3ef42c9aeab4b6e2ebfc1294d858720effb84c0b7eab883834259c0d6e5032"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.384476 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql" event={"ID":"a19bd4f6-2bd3-404f-9282-89a12444562f","Type":"ContainerStarted","Data":"f62f2419443dbd6661be044729f768ccebdb3796249bce6664cf57d8641bf384"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.386056 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" event={"ID":"ce43b92c-92e6-45b3-abf9-20db64e0ec05","Type":"ContainerStarted","Data":"730dc3bff3a6a29779ad43ae3d8ca78a0c4a8dd6df54eadd303b7a5d92f4b682"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.387274 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-77ljw" event={"ID":"304c4783-b723-4164-b000-8ae81986da3a","Type":"ContainerStarted","Data":"52a1cf1b17e84f5fa9e7f3897d8100218bbfc0d0157a89a2e8c9e4555f749262"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.388321 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" event={"ID":"f6a8f6cb-88cc-45a6-bc66-49e33d32851e","Type":"ContainerStarted","Data":"08ca212656c46ac4ec3dcb9a65c9445ddb06160c3526bc0de0bcd4c5a9bb5808"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.389532 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt" event={"ID":"f1d9d2c1-9ab4-46c6-b448-989fb8cfdc38","Type":"ContainerStarted","Data":"9dd04ed20cfe4736829c1af9b34a6d8a99695dc1b8d2967be1f5abaa49307b91"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.390589 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" event={"ID":"a452d812-570e-4a9a-a473-c7bfa1daffe1","Type":"ContainerStarted","Data":"97edc180e9c43523d4b207b3dde79f4c107f6670f323841ba4bbba2ed5d83d2b"} Feb 28 03:37:59 crc 
kubenswrapper[4819]: I0228 03:37:59.394271 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" event={"ID":"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e","Type":"ContainerStarted","Data":"c5411853737b74e235b68625a1a17db140f9313f827fb3d497e36b83a7511168"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.394321 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.395744 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" event={"ID":"47b61e92-8bbd-40e0-96ee-b3bacd20950d","Type":"ContainerStarted","Data":"0e5bac5320a136d149d1afc933166bd38306bcc0ccff260fa18f94a84c712071"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.396754 4819 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-b4blq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.396788 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" podUID="b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.398109 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds" event={"ID":"e72012e1-89cc-4b65-ab38-78f09ec59ea4","Type":"ContainerStarted","Data":"9b47cf5271c1945a240a70e35cd22292017fbe118067e6477d012bc935a8f062"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 
03:37:59.399898 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2s2sf" event={"ID":"b8f7cbee-14ae-4ae9-a139-8eea6fe271a3","Type":"ContainerStarted","Data":"6324936fa886a8f2f19763bab7a907ea1336ae711b082f11ae748076e640e038"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.400885 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m" event={"ID":"dea5b1f1-4b1d-4741-be96-ec4c56b3c3db","Type":"ContainerStarted","Data":"95e38b2a904f8fbd30573a8777c82b3bb26010ca8bf53f3ea91da88b9196a2a5"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.401936 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537496-dqglv" event={"ID":"23ec7136-9dc9-47c2-bf41-7b798e6bfe60","Type":"ContainerStarted","Data":"ae6f91a1192a99fb46bbf533abd1f976b5c7a4e526898438d763b6b1fb54b3b9"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.402952 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q8wxx" event={"ID":"f70ddc86-ef85-47d5-a9de-2c2a314ca39b","Type":"ContainerStarted","Data":"0b348714ba1f492a14f6480670516aadba77412045b40ca9a52340c9ce429ab5"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.404121 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" event={"ID":"c1530a3b-acb7-4a9a-bc2b-339db29fa05f","Type":"ContainerStarted","Data":"cf427cbcdc0bca4f0fe77091dcfde868a73afe863e1a572af52900e92b46e7de"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.406709 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" event={"ID":"f3590dc7-98a1-45cf-a420-f045d5d38335","Type":"ContainerStarted","Data":"da5d61713f53a5a3908715f6665dae231eeefcefe7344c1daf71ef4cf8a4d508"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 
03:37:59.406869 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.407594 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c2zzz" event={"ID":"66425eb0-e8aa-44b7-b316-10949f9cc414","Type":"ContainerStarted","Data":"ee4d932bb796554b3a7f7ce772d30a94136c0bdf76ab8571e151f483ad252f35"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.415384 4819 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-w5fln container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body= Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.415430 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" podUID="f3590dc7-98a1-45cf-a420-f045d5d38335" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.415496 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" event={"ID":"0a08aebb-db7e-488c-b992-2286ba6c9fd0","Type":"ContainerStarted","Data":"aadc38e72b33d5a0a6160502cbdfa16994bdcad2504e525fc67a0f5f8227a7b3"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.421880 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 
28 03:37:59 crc kubenswrapper[4819]: E0228 03:37:59.422240 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:37:59.922229692 +0000 UTC m=+218.387798550 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.428149 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6j222" event={"ID":"c8fc9078-b36f-49a3-b53c-67f60a904e8d","Type":"ContainerStarted","Data":"24a262dd841c2a83e22526578b34497415a1032df2a06fe9a9d960a98276a6d7"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.428698 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qcfds" podStartSLOduration=159.428686352 podStartE2EDuration="2m39.428686352s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:59.427480602 +0000 UTC m=+217.893049470" watchObservedRunningTime="2026-02-28 03:37:59.428686352 +0000 UTC m=+217.894255210" Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.428864 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" 
event={"ID":"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f","Type":"ContainerStarted","Data":"c9b897f52281c90f3e11dd3652cefdecbf9a9691fa29a6fc6ab42ab7603ebc7f"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.429451 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" podStartSLOduration=159.429445571 podStartE2EDuration="2m39.429445571s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:59.415574136 +0000 UTC m=+217.881142994" watchObservedRunningTime="2026-02-28 03:37:59.429445571 +0000 UTC m=+217.895014429" Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.436493 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b72rs" event={"ID":"85ff71ae-ba6c-4f29-9645-1eb02dc904f1","Type":"ContainerStarted","Data":"13a8a59c355b53d2ec66427ef3b5f867f0fdbc3dd02620c7475d821678ecfe71"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.438412 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" event={"ID":"55ccad0a-8169-4f92-89e1-6b8db13f255b","Type":"ContainerStarted","Data":"03f325f485eb78b899df8fe5afeee822bddd0c37b66ae369d8e4f64211cef59c"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.444740 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72" event={"ID":"ac6ce0f1-1683-417d-8a4b-aa5067e21b2b","Type":"ContainerStarted","Data":"9ec0856164f3bb41a1be577567cd6963434df93500129f6917f5241dfb2d3c79"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.449585 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" 
event={"ID":"0bba4110-e884-4369-8e1e-676a7afed536","Type":"ContainerStarted","Data":"911f73f32f4063c9853de0f220e708122460a3c8c9ef767e6fc50b582e277ed1"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.451061 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" event={"ID":"05c5a8ba-5dcc-460d-830e-f55967fa0dbf","Type":"ContainerStarted","Data":"170bae5e90dd0de913054b5d066d0c553de3962604f041d7b04523dddd89a765"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.453667 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj" event={"ID":"e6c1e24e-7367-4030-bbe3-0c5a2595ba4a","Type":"ContainerStarted","Data":"5d16531e26374462429033f4373616c9240af313ef073c8521c6b6a42deea0a1"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.455187 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" event={"ID":"30d3c101-bb83-41b0-88b0-a09b6135d7d8","Type":"ContainerStarted","Data":"6ae6fc56e0e925b4ffc2a24d4be21b21b42abdaae551da91717a502123f9d06e"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.457010 4819 generic.go:334] "Generic (PLEG): container finished" podID="ed79a80c-ec3b-4446-9ad3-4e1906715cd7" containerID="ddb27b81c0f139c4405bc80a4daffa273276bebbea30716b4bc1b43e1b10f8fe" exitCode=0 Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.457058 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" event={"ID":"ed79a80c-ec3b-4446-9ad3-4e1906715cd7","Type":"ContainerDied","Data":"ddb27b81c0f139c4405bc80a4daffa273276bebbea30716b4bc1b43e1b10f8fe"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.469857 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" 
podStartSLOduration=159.469837786 podStartE2EDuration="2m39.469837786s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:59.448054204 +0000 UTC m=+217.913623072" watchObservedRunningTime="2026-02-28 03:37:59.469837786 +0000 UTC m=+217.935406644" Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.471929 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bpmrd" event={"ID":"1cb02855-da72-47a4-9456-3b3d9faf61a8","Type":"ContainerStarted","Data":"7b7ae9e7a07a04f08223aab2ca5b7dc2edb155ed2a515a480f3896f4efef888f"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.473911 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" event={"ID":"b6a78496-9606-4297-b022-286a969e9ea6","Type":"ContainerStarted","Data":"75f1aff2515d484aa033c0b5ec1b60d9f40aab1f384e9131756bf0b18e2d44e3"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.475496 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" event={"ID":"257ec8d9-46b8-445b-883d-cd842a4b8b61","Type":"ContainerStarted","Data":"b9744f1564298e5223aea252aa1f4403d97b5e288646626850fa6bfbb0ec040a"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.477452 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" event={"ID":"3a12e5e4-9129-4a00-ae0c-684869c0cff7","Type":"ContainerStarted","Data":"ae8cb9cc4375adb1d1edd391ffdd7fe94671e73f49efca5fea98e3a368205915"} Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.481058 4819 patch_prober.go:28] interesting pod/downloads-7954f5f757-trsfk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.481097 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-trsfk" podUID="56a73ad1-cc3f-445b-8e0a-d8ff6937ba57" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.522925 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:59 crc kubenswrapper[4819]: E0228 03:37:59.524367 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.024350522 +0000 UTC m=+218.489919380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.624724 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:59 crc kubenswrapper[4819]: E0228 03:37:59.625177 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.12516378 +0000 UTC m=+218.590732638 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.670945 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 03:37:59 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld Feb 28 03:37:59 crc kubenswrapper[4819]: [+]process-running ok Feb 28 03:37:59 crc kubenswrapper[4819]: healthz check failed Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.671323 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.725523 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:59 crc kubenswrapper[4819]: E0228 03:37:59.725711 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-28 03:38:00.22568494 +0000 UTC m=+218.691253808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.725809 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:59 crc kubenswrapper[4819]: E0228 03:37:59.726146 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.226136161 +0000 UTC m=+218.691705029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.826873 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:37:59 crc kubenswrapper[4819]: E0228 03:37:59.827221 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.327205456 +0000 UTC m=+218.792774314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:37:59 crc kubenswrapper[4819]: I0228 03:37:59.928462 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:37:59 crc kubenswrapper[4819]: E0228 03:37:59.928828 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.428812303 +0000 UTC m=+218.894381161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.033061 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.033319 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.533293422 +0000 UTC m=+218.998862280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.033737 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.034460 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.534443851 +0000 UTC m=+219.000012709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.128847 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pgp72" podStartSLOduration=160.128831509 podStartE2EDuration="2m40.128831509s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:37:59.464437552 +0000 UTC m=+217.930006420" watchObservedRunningTime="2026-02-28 03:38:00.128831509 +0000 UTC m=+218.594400367" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.129887 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537498-5h22n"] Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.130610 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537498-5h22n" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.135506 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.135678 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.635655618 +0000 UTC m=+219.101224476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.135822 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.136159 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.63614262 +0000 UTC m=+219.101711478 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.137256 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537498-5h22n"] Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.236846 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.237036 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.737003889 +0000 UTC m=+219.202572747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.237096 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whsnn\" (UniqueName: \"kubernetes.io/projected/500425cb-63aa-43e1-bc7b-2c11c88826c5-kube-api-access-whsnn\") pod \"auto-csr-approver-29537498-5h22n\" (UID: \"500425cb-63aa-43e1-bc7b-2c11c88826c5\") " pod="openshift-infra/auto-csr-approver-29537498-5h22n" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.237270 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.237636 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.737622865 +0000 UTC m=+219.203191723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.338632 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.338899 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whsnn\" (UniqueName: \"kubernetes.io/projected/500425cb-63aa-43e1-bc7b-2c11c88826c5-kube-api-access-whsnn\") pod \"auto-csr-approver-29537498-5h22n\" (UID: \"500425cb-63aa-43e1-bc7b-2c11c88826c5\") " pod="openshift-infra/auto-csr-approver-29537498-5h22n" Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.339049 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.839017937 +0000 UTC m=+219.304586795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.369808 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whsnn\" (UniqueName: \"kubernetes.io/projected/500425cb-63aa-43e1-bc7b-2c11c88826c5-kube-api-access-whsnn\") pod \"auto-csr-approver-29537498-5h22n\" (UID: \"500425cb-63aa-43e1-bc7b-2c11c88826c5\") " pod="openshift-infra/auto-csr-approver-29537498-5h22n" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.440105 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.440676 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:00.940655995 +0000 UTC m=+219.406224883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.444867 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537498-5h22n" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.488333 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw" event={"ID":"ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8","Type":"ContainerStarted","Data":"a89e17c6ea4fbe3da82159ad2b9c5712fb6a7f950981f5eae44c3f592aeeba1a"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.488535 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.492569 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql" event={"ID":"a19bd4f6-2bd3-404f-9282-89a12444562f","Type":"ContainerStarted","Data":"e79f65ad823245b5a4e22004b0de91151ffea6a76f9f24b047c4876efed85eb8"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.494518 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" event={"ID":"ce43b92c-92e6-45b3-abf9-20db64e0ec05","Type":"ContainerStarted","Data":"5f3993d3cde5d0141d81bf580d48f1268d284dad4825ac4b6f7de7198f9beef3"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.495218 4819 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.495920 4819 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mxlcb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.495949 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" podUID="ce43b92c-92e6-45b3-abf9-20db64e0ec05" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.501749 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" event={"ID":"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f","Type":"ContainerStarted","Data":"4a4a055174a115b8e3f762bb4a1231e15d59cdd357400971531f17a60825dce1"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.502658 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" event={"ID":"30d3c101-bb83-41b0-88b0-a09b6135d7d8","Type":"ContainerStarted","Data":"ef638701de9adba4b0de0db839f3687c5bbce1624e0166df1365bbc71f2525c8"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.505380 4819 generic.go:334] "Generic (PLEG): container finished" podID="5b1cfdf7-8bbd-4913-9786-2a71cf6baec1" containerID="81fc79eab8a81815033a6c03923a0254a2407f8854b263b2ebe555d8e99a40d9" exitCode=0 Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.505777 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" event={"ID":"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1","Type":"ContainerDied","Data":"81fc79eab8a81815033a6c03923a0254a2407f8854b263b2ebe555d8e99a40d9"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.510141 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw" podStartSLOduration=160.510124683 podStartE2EDuration="2m40.510124683s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.509734774 +0000 UTC m=+218.975303632" watchObservedRunningTime="2026-02-28 03:38:00.510124683 +0000 UTC m=+218.975693531" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.515415 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" event={"ID":"f6a8f6cb-88cc-45a6-bc66-49e33d32851e","Type":"ContainerStarted","Data":"a292dbaa02839a3a9c858fc307f7065ddf2a1064e87e9bbcaf839d1cbecc2ec0"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.515775 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.516920 4819 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pgpkd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.516958 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" podUID="f6a8f6cb-88cc-45a6-bc66-49e33d32851e" containerName="packageserver" 
probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.518016 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m" event={"ID":"dea5b1f1-4b1d-4741-be96-ec4c56b3c3db","Type":"ContainerStarted","Data":"70b08633088d5bf12f9a74403eb9c54e3b5841798374b729dc11018efd4f3aa9"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.518056 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m" event={"ID":"dea5b1f1-4b1d-4741-be96-ec4c56b3c3db","Type":"ContainerStarted","Data":"90118c1fc2160f6b32b1b98cebaa2dbaa4895c979cbe027f085e05308ffc8b3b"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.520093 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" event={"ID":"c1530a3b-acb7-4a9a-bc2b-339db29fa05f","Type":"ContainerStarted","Data":"146da0f5373b74d8470816da036770d9958a9832f83a4b6a4534390b0f7a9501"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.520847 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.521577 4819 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-pg9r5 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.521629 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" podUID="c1530a3b-acb7-4a9a-bc2b-339db29fa05f" containerName="olm-operator" 
probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.525618 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl" event={"ID":"ed7e72bf-07cf-4643-80da-6d11f847a61b","Type":"ContainerStarted","Data":"665ab7abe89b17919d47915002a536a548e213087dd71dd7366350e572b4a289"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.525663 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl" event={"ID":"ed7e72bf-07cf-4643-80da-6d11f847a61b","Type":"ContainerStarted","Data":"2760642f6ce1133f2efadd52e619e48833929518ce6efd73ceb1831805739a90"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.527752 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f6bql" podStartSLOduration=159.527737731 podStartE2EDuration="2m39.527737731s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.526715826 +0000 UTC m=+218.992284694" watchObservedRunningTime="2026-02-28 03:38:00.527737731 +0000 UTC m=+218.993306590" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.531330 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" event={"ID":"a452d812-570e-4a9a-a473-c7bfa1daffe1","Type":"ContainerStarted","Data":"4225ddc71859eb1cb45563c8af8d3c527d9eb27dae395b303e49dc76ea7bf91c"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.538698 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" 
event={"ID":"47b61e92-8bbd-40e0-96ee-b3bacd20950d","Type":"ContainerStarted","Data":"4424ff35ac41141dd412392f2ec4513a6e484cb51ead5c3f25a0c4a9a63b7f9e"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.541195 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" event={"ID":"3a12e5e4-9129-4a00-ae0c-684869c0cff7","Type":"ContainerStarted","Data":"390d55e3c1d4a275073a894872ecec77c6574d01473c8d88e4032a90bdd3e804"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.546945 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" event={"ID":"05c5a8ba-5dcc-460d-830e-f55967fa0dbf","Type":"ContainerStarted","Data":"5af8dd11c5a56ff43d8117f12d910d8550c2295b848d6128fbdab9ba771e40df"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.550166 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.550475 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" podStartSLOduration=159.550465347 podStartE2EDuration="2m39.550465347s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.544850807 +0000 UTC m=+219.010419665" watchObservedRunningTime="2026-02-28 03:38:00.550465347 +0000 UTC m=+219.016034205" Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.552132 4819 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.052116878 +0000 UTC m=+219.517685736 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.556133 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.558275 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.058263711 +0000 UTC m=+219.523832619 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.558761 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c2zzz" event={"ID":"66425eb0-e8aa-44b7-b316-10949f9cc414","Type":"ContainerStarted","Data":"fa5ea352cbe7833dbb5288d776c95d092dec3357250f25916bbd035075c779c0"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.568223 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q8wxx" event={"ID":"f70ddc86-ef85-47d5-a9de-2c2a314ca39b","Type":"ContainerStarted","Data":"7c2c91203caa78a282dc162e3189d9e5925df4994a9df778893ed9b75ca098a3"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.575698 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" event={"ID":"55ccad0a-8169-4f92-89e1-6b8db13f255b","Type":"ContainerStarted","Data":"153d624351b30384781564af346ea983aabc7c5fbe39a3aeb32602c21d9ad4e9"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.575738 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" event={"ID":"55ccad0a-8169-4f92-89e1-6b8db13f255b","Type":"ContainerStarted","Data":"3377e00b0f6ee0f8dfcbd0c0b8a37974c88bcf13a34df2328c5b3480acac3fe9"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.578158 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2s2sf" 
event={"ID":"b8f7cbee-14ae-4ae9-a139-8eea6fe271a3","Type":"ContainerStarted","Data":"88a1b51168ace373d9115a008f93c23c82b8951f20b6c39d024fd8bcf0643a52"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.582006 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" event={"ID":"b6a78496-9606-4297-b022-286a969e9ea6","Type":"ContainerStarted","Data":"dd857b9968b6732d02bb5db7a0d6bd2d144427f5b79b7e92e9c88b285cd5bec4"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.586009 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" event={"ID":"0bba4110-e884-4369-8e1e-676a7afed536","Type":"ContainerStarted","Data":"c8b559041c79ccb60fc2d4429749383453673b966d3998ddb250a5e1ab210d80"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.586618 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.595191 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" event={"ID":"0a08aebb-db7e-488c-b992-2286ba6c9fd0","Type":"ContainerStarted","Data":"332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.595623 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.602491 4819 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nb49n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 
03:38:00.602547 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" podUID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.616100 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t" event={"ID":"ef03f8a2-9dde-4ee7-af68-2693ddf59fb4","Type":"ContainerStarted","Data":"cd39ecb42ca09fe9bc1b37aa480f159e948ac0ea34d0d9ebfd9bc46fe1f4ce3f"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.633038 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6j222" event={"ID":"c8fc9078-b36f-49a3-b53c-67f60a904e8d","Type":"ContainerStarted","Data":"e6e633a41877d7d7c4aadae0b88e7ba5b7e8107240bcb765af83543fef087250"} Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.633120 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.632224 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dc26n" podStartSLOduration=159.63220307 podStartE2EDuration="2m39.63220307s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.602712767 +0000 UTC m=+219.068281635" watchObservedRunningTime="2026-02-28 03:38:00.63220307 +0000 UTC m=+219.097771928" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.633460 4819 patch_prober.go:28] interesting pod/downloads-7954f5f757-trsfk container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.633497 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-trsfk" podUID="56a73ad1-cc3f-445b-8e0a-d8ff6937ba57" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.645357 4819 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-9z6bd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.645437 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" podUID="257ec8d9-46b8-445b-883d-cd842a4b8b61" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.646302 4819 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-b4blq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.646334 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" podUID="b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.646359 4819 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-w5fln container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body= Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.646406 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" podUID="f3590dc7-98a1-45cf-a420-f045d5d38335" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.658768 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.671837 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.171807845 +0000 UTC m=+219.637376703 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.673181 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 03:38:00 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld Feb 28 03:38:00 crc kubenswrapper[4819]: [+]process-running ok Feb 28 03:38:00 crc kubenswrapper[4819]: healthz check failed Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.673225 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.678007 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.683660 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" podStartSLOduration=159.683632309 
podStartE2EDuration="2m39.683632309s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.643517752 +0000 UTC m=+219.109086610" watchObservedRunningTime="2026-02-28 03:38:00.683632309 +0000 UTC m=+219.149201167" Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.684219 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.184202984 +0000 UTC m=+219.649771842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.713206 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" podStartSLOduration=159.713178174 podStartE2EDuration="2m39.713178174s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.680608484 +0000 UTC m=+219.146177342" watchObservedRunningTime="2026-02-28 03:38:00.713178174 +0000 UTC m=+219.178747032" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.744068 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-t5bm5" 
podStartSLOduration=160.744046322 podStartE2EDuration="2m40.744046322s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.699498334 +0000 UTC m=+219.165067202" watchObservedRunningTime="2026-02-28 03:38:00.744046322 +0000 UTC m=+219.209615180" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.745619 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" podStartSLOduration=160.745613531 podStartE2EDuration="2m40.745613531s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.735086419 +0000 UTC m=+219.200655277" watchObservedRunningTime="2026-02-28 03:38:00.745613531 +0000 UTC m=+219.211182389" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.768788 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-2s2sf" podStartSLOduration=6.768769487 podStartE2EDuration="6.768769487s" podCreationTimestamp="2026-02-28 03:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.750544214 +0000 UTC m=+219.216113062" watchObservedRunningTime="2026-02-28 03:38:00.768769487 +0000 UTC m=+219.234338345" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.783822 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:00 
crc kubenswrapper[4819]: E0228 03:38:00.785281 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.285257287 +0000 UTC m=+219.750826145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.791300 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-c2zzz" podStartSLOduration=6.791287737 podStartE2EDuration="6.791287737s" podCreationTimestamp="2026-02-28 03:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.790310653 +0000 UTC m=+219.255879501" watchObservedRunningTime="2026-02-28 03:38:00.791287737 +0000 UTC m=+219.256856595" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.799709 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-tdcjf" podStartSLOduration=159.799688296 podStartE2EDuration="2m39.799688296s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.769387353 +0000 UTC m=+219.234956211" watchObservedRunningTime="2026-02-28 03:38:00.799688296 +0000 UTC m=+219.265257154" 
Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.818409 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h6d62" podStartSLOduration=160.818392882 podStartE2EDuration="2m40.818392882s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.815698935 +0000 UTC m=+219.281267793" watchObservedRunningTime="2026-02-28 03:38:00.818392882 +0000 UTC m=+219.283961740" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.861333 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtnpx" podStartSLOduration=159.861318829 podStartE2EDuration="2m39.861318829s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.859621247 +0000 UTC m=+219.325190105" watchObservedRunningTime="2026-02-28 03:38:00.861318829 +0000 UTC m=+219.326887687" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.886964 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.887290 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-28 03:38:01.387279405 +0000 UTC m=+219.852848263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.888884 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537498-5h22n"] Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.900007 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" podStartSLOduration=159.899988201 podStartE2EDuration="2m39.899988201s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.897684534 +0000 UTC m=+219.363253392" watchObservedRunningTime="2026-02-28 03:38:00.899988201 +0000 UTC m=+219.365557059" Feb 28 03:38:00 crc kubenswrapper[4819]: W0228 03:38:00.917587 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod500425cb_63aa_43e1_bc7b_2c11c88826c5.slice/crio-472faf0bfb7da0107e836214b705e09a0e1da2f1acc37474d5e93b0faa58d1f4 WatchSource:0}: Error finding container 472faf0bfb7da0107e836214b705e09a0e1da2f1acc37474d5e93b0faa58d1f4: Status 404 returned error can't find the container with id 472faf0bfb7da0107e836214b705e09a0e1da2f1acc37474d5e93b0faa58d1f4 Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.928478 4819 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" podStartSLOduration=159.92846118 podStartE2EDuration="2m39.92846118s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.926649125 +0000 UTC m=+219.392217983" watchObservedRunningTime="2026-02-28 03:38:00.92846118 +0000 UTC m=+219.394030038" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.951527 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" podStartSLOduration=159.951503113 podStartE2EDuration="2m39.951503113s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.942870218 +0000 UTC m=+219.408439076" watchObservedRunningTime="2026-02-28 03:38:00.951503113 +0000 UTC m=+219.417071971" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.969194 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6j222" podStartSLOduration=159.969175772 podStartE2EDuration="2m39.969175772s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:00.964495816 +0000 UTC m=+219.430064674" watchObservedRunningTime="2026-02-28 03:38:00.969175772 +0000 UTC m=+219.434744630" Feb 28 03:38:00 crc kubenswrapper[4819]: I0228 03:38:00.988735 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:00 crc kubenswrapper[4819]: E0228 03:38:00.989112 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.489097768 +0000 UTC m=+219.954666626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.004563 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-szh85" podStartSLOduration=161.004545382 podStartE2EDuration="2m41.004545382s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.00245429 +0000 UTC m=+219.468023158" watchObservedRunningTime="2026-02-28 03:38:01.004545382 +0000 UTC m=+219.470114240" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.008570 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.022990 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75zxt" podStartSLOduration=160.022972701 
podStartE2EDuration="2m40.022972701s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.021454263 +0000 UTC m=+219.487023121" watchObservedRunningTime="2026-02-28 03:38:01.022972701 +0000 UTC m=+219.488541559" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.050253 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7nkqj" podStartSLOduration=161.050228659 podStartE2EDuration="2m41.050228659s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.04788239 +0000 UTC m=+219.513451258" watchObservedRunningTime="2026-02-28 03:38:01.050228659 +0000 UTC m=+219.515797517" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.069162 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-77ljw" podStartSLOduration=161.069142689 podStartE2EDuration="2m41.069142689s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.067879768 +0000 UTC m=+219.533448626" watchObservedRunningTime="2026-02-28 03:38:01.069142689 +0000 UTC m=+219.534711567" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.092526 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7rkdb" podStartSLOduration=161.09250599 podStartE2EDuration="2m41.09250599s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 
03:38:01.09009363 +0000 UTC m=+219.555662488" watchObservedRunningTime="2026-02-28 03:38:01.09250599 +0000 UTC m=+219.558074858" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.094405 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.095179 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.595163006 +0000 UTC m=+220.060731864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.148728 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bpmrd" podStartSLOduration=161.148711619 podStartE2EDuration="2m41.148711619s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.123541662 +0000 UTC m=+219.589110510" watchObservedRunningTime="2026-02-28 03:38:01.148711619 +0000 UTC m=+219.614280477" Feb 28 
03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.167686 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w5n5t" podStartSLOduration=160.16767239 podStartE2EDuration="2m40.16767239s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.148184305 +0000 UTC m=+219.613753163" watchObservedRunningTime="2026-02-28 03:38:01.16767239 +0000 UTC m=+219.633241248" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.195816 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.196046 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.696004515 +0000 UTC m=+220.161573373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.196210 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.196506 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.696495617 +0000 UTC m=+220.162064475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.206695 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v629x" podStartSLOduration=160.20667668 podStartE2EDuration="2m40.20667668s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.168119151 +0000 UTC m=+219.633688019" watchObservedRunningTime="2026-02-28 03:38:01.20667668 +0000 UTC m=+219.672245538" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.297452 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.297641 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.797592812 +0000 UTC m=+220.263161670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.297814 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.298189 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.798179937 +0000 UTC m=+220.263748795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.337706 4819 ???:1] "http: TLS handshake error from 192.168.126.11:36110: no serving certificate available for the kubelet" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.399483 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.399592 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.899569859 +0000 UTC m=+220.365138717 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.399716 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.400012 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:01.900004689 +0000 UTC m=+220.365573547 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.428291 4819 ???:1] "http: TLS handshake error from 192.168.126.11:36124: no serving certificate available for the kubelet" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.500554 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.500963 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:02.00094986 +0000 UTC m=+220.466518718 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.529128 4819 ???:1] "http: TLS handshake error from 192.168.126.11:36128: no serving certificate available for the kubelet" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.548963 4819 ???:1] "http: TLS handshake error from 192.168.126.11:36144: no serving certificate available for the kubelet" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.569505 4819 ???:1] "http: TLS handshake error from 192.168.126.11:36150: no serving certificate available for the kubelet" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.602385 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.602753 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:02.102742723 +0000 UTC m=+220.568311581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.639733 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q8wxx" event={"ID":"f70ddc86-ef85-47d5-a9de-2c2a314ca39b","Type":"ContainerStarted","Data":"2b39cbd7efb27424f6364cd060a62c2211b780329db82da04b5a8d0ea7321f74"} Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.640807 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-q8wxx" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.647852 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" event={"ID":"a452d812-570e-4a9a-a473-c7bfa1daffe1","Type":"ContainerStarted","Data":"a1598495f8322cb9a90d240402e7aad186bb269115d0b348a6cc4b2d64cb9207"} Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.651661 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537498-5h22n" event={"ID":"500425cb-63aa-43e1-bc7b-2c11c88826c5","Type":"ContainerStarted","Data":"472faf0bfb7da0107e836214b705e09a0e1da2f1acc37474d5e93b0faa58d1f4"} Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.654471 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" event={"ID":"5b1cfdf7-8bbd-4913-9786-2a71cf6baec1","Type":"ContainerStarted","Data":"59b83d460fd586d2e176f9a3c54c207505252219cc8a481a97bd2edd2951ddd9"} Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 
03:38:01.665634 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" event={"ID":"ed79a80c-ec3b-4446-9ad3-4e1906715cd7","Type":"ContainerStarted","Data":"b0776b4a4160351956a40c9d5d7039ccf41961c5ddb9c7043e00dea7a0b3129e"} Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.672591 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 03:38:01 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld Feb 28 03:38:01 crc kubenswrapper[4819]: [+]process-running ok Feb 28 03:38:01 crc kubenswrapper[4819]: healthz check failed Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.672676 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.674552 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" event={"ID":"6a38ecc3-8eb5-4916-b2ba-b0e64bb4722f","Type":"ContainerStarted","Data":"de73d1e686450a6309802fb95b1cfa70b85fd50b9a1f3ad7d1faa2b884d1a109"} Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.680175 4819 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-pg9r5 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.680227 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" 
podUID="c1530a3b-acb7-4a9a-bc2b-339db29fa05f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.680306 4819 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pgpkd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.680362 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" podUID="f6a8f6cb-88cc-45a6-bc66-49e33d32851e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.680598 4819 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mxlcb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.680658 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" podUID="ce43b92c-92e6-45b3-abf9-20db64e0ec05" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.682094 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-q8wxx" podStartSLOduration=8.682085226 podStartE2EDuration="8.682085226s" 
podCreationTimestamp="2026-02-28 03:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.658653733 +0000 UTC m=+220.124222591" watchObservedRunningTime="2026-02-28 03:38:01.682085226 +0000 UTC m=+220.147654074" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.682122 4819 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-9z6bd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.682144 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" podUID="257ec8d9-46b8-445b-883d-cd842a4b8b61" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.682237 4819 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nb49n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.682328 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" podUID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.682646 4819 ???:1] "http: TLS handshake error from 192.168.126.11:36164: no 
serving certificate available for the kubelet" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.682706 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj" podStartSLOduration=160.682702442 podStartE2EDuration="2m40.682702442s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.681950363 +0000 UTC m=+220.147519211" watchObservedRunningTime="2026-02-28 03:38:01.682702442 +0000 UTC m=+220.148271300" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.702236 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s2mhl" podStartSLOduration=160.702219447 podStartE2EDuration="2m40.702219447s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.699417807 +0000 UTC m=+220.164986665" watchObservedRunningTime="2026-02-28 03:38:01.702219447 +0000 UTC m=+220.167788305" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.703201 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.706829 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-28 03:38:02.206791551 +0000 UTC m=+220.672360409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.734188 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v4td8" podStartSLOduration=160.734171342 podStartE2EDuration="2m40.734171342s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.719699172 +0000 UTC m=+220.185268030" watchObservedRunningTime="2026-02-28 03:38:01.734171342 +0000 UTC m=+220.199740200" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.736971 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-rtwrl" podStartSLOduration=160.736953241 podStartE2EDuration="2m40.736953241s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.73368027 +0000 UTC m=+220.199249128" watchObservedRunningTime="2026-02-28 03:38:01.736953241 +0000 UTC m=+220.202522099" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.750988 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b7z5m" podStartSLOduration=160.75097227 
podStartE2EDuration="2m40.75097227s" podCreationTimestamp="2026-02-28 03:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:01.749220106 +0000 UTC m=+220.214788964" watchObservedRunningTime="2026-02-28 03:38:01.75097227 +0000 UTC m=+220.216541128" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.805774 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.807386 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:02.307371593 +0000 UTC m=+220.772940441 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.882446 4819 ???:1] "http: TLS handshake error from 192.168.126.11:36176: no serving certificate available for the kubelet" Feb 28 03:38:01 crc kubenswrapper[4819]: I0228 03:38:01.907675 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:01 crc kubenswrapper[4819]: E0228 03:38:01.907963 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:02.407948605 +0000 UTC m=+220.873517463 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.009611 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.010042 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:02.510027824 +0000 UTC m=+220.975596682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.110311 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.110634 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:02.610615276 +0000 UTC m=+221.076184134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.212039 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.212409 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:02.712398188 +0000 UTC m=+221.177967046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.231510 4819 ???:1] "http: TLS handshake error from 192.168.126.11:36182: no serving certificate available for the kubelet" Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.266234 4819 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-bj4zw container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.266294 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw" podUID="ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.266534 4819 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-bj4zw container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.266552 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw" 
podUID="ccbe48a5-86b6-462e-a8e0-8bcfe64d57b8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.313003 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.313149 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:02.813116433 +0000 UTC m=+221.278685291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.313501 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.313789 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:02.81377835 +0000 UTC m=+221.279347208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.414929 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.415105 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:02.91508094 +0000 UTC m=+221.380649798 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.415432 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.415730 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:02.915720106 +0000 UTC m=+221.381288964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.516985 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.517396 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.017378614 +0000 UTC m=+221.482947472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.618716 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.619015 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.119004532 +0000 UTC m=+221.584573390 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.678075 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 28 03:38:02 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld
Feb 28 03:38:02 crc kubenswrapper[4819]: [+]process-running ok
Feb 28 03:38:02 crc kubenswrapper[4819]: healthz check failed
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.678481 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.682375 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b72rs" event={"ID":"85ff71ae-ba6c-4f29-9645-1eb02dc904f1","Type":"ContainerStarted","Data":"fae16228d4aab4f5c25f85bbd5425a90ab3d19867f84bd633325d06b79a3a5f6"}
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.687408 4819 generic.go:334] "Generic (PLEG): container finished" podID="3a12e5e4-9129-4a00-ae0c-684869c0cff7" containerID="390d55e3c1d4a275073a894872ecec77c6574d01473c8d88e4032a90bdd3e804" exitCode=0
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.687478 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" event={"ID":"3a12e5e4-9129-4a00-ae0c-684869c0cff7","Type":"ContainerDied","Data":"390d55e3c1d4a275073a894872ecec77c6574d01473c8d88e4032a90bdd3e804"}
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.698120 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" event={"ID":"ed79a80c-ec3b-4446-9ad3-4e1906715cd7","Type":"ContainerStarted","Data":"52c3486d778cba66b2e965f4e1a9c54437472bc15562c9a97996731d5230f497"}
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.700435 4819 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mxlcb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body=
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.700464 4819 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nb49n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body=
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.700467 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" podUID="ce43b92c-92e6-45b3-abf9-20db64e0ec05" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused"
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.700492 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" podUID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused"
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.721137 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.721336 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.221310937 +0000 UTC m=+221.686879795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.721454 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.721787 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.221779719 +0000 UTC m=+221.687348577 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.822602 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.823716 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.323700964 +0000 UTC m=+221.789269812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.894369 4819 ???:1] "http: TLS handshake error from 192.168.126.11:36196: no serving certificate available for the kubelet"
Feb 28 03:38:02 crc kubenswrapper[4819]: I0228 03:38:02.924490 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:02 crc kubenswrapper[4819]: E0228 03:38:02.924868 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.42485296 +0000 UTC m=+221.890421818 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.025264 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.025409 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.52538043 +0000 UTC m=+221.990949288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.025500 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.025804 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.52579368 +0000 UTC m=+221.991362538 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.126808 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.126960 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.626929706 +0000 UTC m=+222.092498574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.127362 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.127749 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.627738086 +0000 UTC m=+222.093306944 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.228487 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.228701 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.728673887 +0000 UTC m=+222.194242745 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.228802 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.229113 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.729102378 +0000 UTC m=+222.194671246 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.298762 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" podStartSLOduration=163.29874626 podStartE2EDuration="2m43.29874626s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:02.744572986 +0000 UTC m=+221.210141854" watchObservedRunningTime="2026-02-28 03:38:03.29874626 +0000 UTC m=+221.764315118"
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.299921 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b4blq"]
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.300112 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" podUID="b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" containerName="controller-manager" containerID="cri-o://c5411853737b74e235b68625a1a17db140f9313f827fb3d497e36b83a7511168" gracePeriod=30
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.307137 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.331730 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.332204 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.832189962 +0000 UTC m=+222.297758820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.353894 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"]
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.354094 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" podUID="257ec8d9-46b8-445b-883d-cd842a4b8b61" containerName="route-controller-manager" containerID="cri-o://b9744f1564298e5223aea252aa1f4403d97b5e288646626850fa6bfbb0ec040a" gracePeriod=30
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.362131 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.432918 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.433544 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:03.933532803 +0000 UTC m=+222.399101661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.437011 4819 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9b0d49a_8f6e_4dd8_8bda_6842443fab5e.slice/crio-c5411853737b74e235b68625a1a17db140f9313f827fb3d497e36b83a7511168.scope\": RecentStats: unable to find data in memory cache]"
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.535666 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.536052 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.036034533 +0000 UTC m=+222.501603391 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.637075 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.637370 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.137359513 +0000 UTC m=+222.602928371 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.671612 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 28 03:38:03 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld
Feb 28 03:38:03 crc kubenswrapper[4819]: [+]process-running ok
Feb 28 03:38:03 crc kubenswrapper[4819]: healthz check failed
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.671664 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.737893 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.738223 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.238209042 +0000 UTC m=+222.703777900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.742870 4819 generic.go:334] "Generic (PLEG): container finished" podID="b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" containerID="c5411853737b74e235b68625a1a17db140f9313f827fb3d497e36b83a7511168" exitCode=0
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.742937 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" event={"ID":"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e","Type":"ContainerDied","Data":"c5411853737b74e235b68625a1a17db140f9313f827fb3d497e36b83a7511168"}
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.753144 4819 generic.go:334] "Generic (PLEG): container finished" podID="257ec8d9-46b8-445b-883d-cd842a4b8b61" containerID="b9744f1564298e5223aea252aa1f4403d97b5e288646626850fa6bfbb0ec040a" exitCode=0
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.753378 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" event={"ID":"257ec8d9-46b8-445b-883d-cd842a4b8b61","Type":"ContainerDied","Data":"b9744f1564298e5223aea252aa1f4403d97b5e288646626850fa6bfbb0ec040a"}
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.839334 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.840433 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.340420185 +0000 UTC m=+222.805989043 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.943227 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.943501 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.443470458 +0000 UTC m=+222.909039316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:03 crc kubenswrapper[4819]: I0228 03:38:03.943899 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:03 crc kubenswrapper[4819]: E0228 03:38:03.944208 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.444195676 +0000 UTC m=+222.909764534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.045048 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.045207 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.545174998 +0000 UTC m=+223.010743856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.045373 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.045755 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.545737702 +0000 UTC m=+223.011306560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.080665 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.118596 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.127846 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.149379 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-serving-cert\") pod \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") "
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.149436 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-client-ca\") pod \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") "
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.149483 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-config\") pod \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") "
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.149512 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2jmg\" (UniqueName: \"kubernetes.io/projected/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-kube-api-access-w2jmg\") pod \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") "
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228
03:38:04.149547 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-proxy-ca-bundles\") pod \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\" (UID: \"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e\") " Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.149643 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.149908 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.649894603 +0000 UTC m=+223.115463461 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.154601 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-client-ca" (OuterVolumeSpecName: "client-ca") pod "b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" (UID: "b9b0d49a-8f6e-4dd8-8bda-6842443fab5e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.154696 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-config" (OuterVolumeSpecName: "config") pod "b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" (UID: "b9b0d49a-8f6e-4dd8-8bda-6842443fab5e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.155056 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" (UID: "b9b0d49a-8f6e-4dd8-8bda-6842443fab5e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.157737 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" (UID: "b9b0d49a-8f6e-4dd8-8bda-6842443fab5e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.160801 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-kube-api-access-w2jmg" (OuterVolumeSpecName: "kube-api-access-w2jmg") pod "b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" (UID: "b9b0d49a-8f6e-4dd8-8bda-6842443fab5e"). InnerVolumeSpecName "kube-api-access-w2jmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.202561 4819 ???:1] "http: TLS handshake error from 192.168.126.11:36208: no serving certificate available for the kubelet" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250187 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-client-ca\") pod \"257ec8d9-46b8-445b-883d-cd842a4b8b61\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250270 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a12e5e4-9129-4a00-ae0c-684869c0cff7-config-volume\") pod \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\" (UID: \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250292 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a12e5e4-9129-4a00-ae0c-684869c0cff7-secret-volume\") pod \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\" (UID: \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250333 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7cg6\" (UniqueName: \"kubernetes.io/projected/257ec8d9-46b8-445b-883d-cd842a4b8b61-kube-api-access-v7cg6\") pod \"257ec8d9-46b8-445b-883d-cd842a4b8b61\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250373 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwm4t\" (UniqueName: \"kubernetes.io/projected/3a12e5e4-9129-4a00-ae0c-684869c0cff7-kube-api-access-bwm4t\") pod \"3a12e5e4-9129-4a00-ae0c-684869c0cff7\" (UID: 
\"3a12e5e4-9129-4a00-ae0c-684869c0cff7\") " Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250411 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-config\") pod \"257ec8d9-46b8-445b-883d-cd842a4b8b61\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250434 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257ec8d9-46b8-445b-883d-cd842a4b8b61-serving-cert\") pod \"257ec8d9-46b8-445b-883d-cd842a4b8b61\" (UID: \"257ec8d9-46b8-445b-883d-cd842a4b8b61\") " Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250669 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250797 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250808 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2jmg\" (UniqueName: \"kubernetes.io/projected/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-kube-api-access-w2jmg\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250819 4819 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 28 
03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250813 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-client-ca" (OuterVolumeSpecName: "client-ca") pod "257ec8d9-46b8-445b-883d-cd842a4b8b61" (UID: "257ec8d9-46b8-445b-883d-cd842a4b8b61"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250827 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.250878 4819 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.251010 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a12e5e4-9129-4a00-ae0c-684869c0cff7-config-volume" (OuterVolumeSpecName: "config-volume") pod "3a12e5e4-9129-4a00-ae0c-684869c0cff7" (UID: "3a12e5e4-9129-4a00-ae0c-684869c0cff7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.251060 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.751046919 +0000 UTC m=+223.216615867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.251482 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-config" (OuterVolumeSpecName: "config") pod "257ec8d9-46b8-445b-883d-cd842a4b8b61" (UID: "257ec8d9-46b8-445b-883d-cd842a4b8b61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.253808 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a12e5e4-9129-4a00-ae0c-684869c0cff7-kube-api-access-bwm4t" (OuterVolumeSpecName: "kube-api-access-bwm4t") pod "3a12e5e4-9129-4a00-ae0c-684869c0cff7" (UID: "3a12e5e4-9129-4a00-ae0c-684869c0cff7"). InnerVolumeSpecName "kube-api-access-bwm4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.254642 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/257ec8d9-46b8-445b-883d-cd842a4b8b61-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "257ec8d9-46b8-445b-883d-cd842a4b8b61" (UID: "257ec8d9-46b8-445b-883d-cd842a4b8b61"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.256127 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a12e5e4-9129-4a00-ae0c-684869c0cff7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3a12e5e4-9129-4a00-ae0c-684869c0cff7" (UID: "3a12e5e4-9129-4a00-ae0c-684869c0cff7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.257352 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257ec8d9-46b8-445b-883d-cd842a4b8b61-kube-api-access-v7cg6" (OuterVolumeSpecName: "kube-api-access-v7cg6") pod "257ec8d9-46b8-445b-883d-cd842a4b8b61" (UID: "257ec8d9-46b8-445b-883d-cd842a4b8b61"). InnerVolumeSpecName "kube-api-access-v7cg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.351426 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.351613 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.85158916 +0000 UTC m=+223.317158008 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.351774 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.351875 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.351888 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257ec8d9-46b8-445b-883d-cd842a4b8b61-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.351897 4819 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/257ec8d9-46b8-445b-883d-cd842a4b8b61-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.351905 4819 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a12e5e4-9129-4a00-ae0c-684869c0cff7-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 
03:38:04.351913 4819 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a12e5e4-9129-4a00-ae0c-684869c0cff7-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.351922 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7cg6\" (UniqueName: \"kubernetes.io/projected/257ec8d9-46b8-445b-883d-cd842a4b8b61-kube-api-access-v7cg6\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.351930 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwm4t\" (UniqueName: \"kubernetes.io/projected/3a12e5e4-9129-4a00-ae0c-684869c0cff7-kube-api-access-bwm4t\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.352074 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.852063572 +0000 UTC m=+223.317632430 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.452564 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.452998 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:04.952981192 +0000 UTC m=+223.418550050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.554137 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.554681 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.054649961 +0000 UTC m=+223.520218819 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.655720 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.655905 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.155880999 +0000 UTC m=+223.621449857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.656062 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.656414 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.156405692 +0000 UTC m=+223.621974550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.673646 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 03:38:04 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld Feb 28 03:38:04 crc kubenswrapper[4819]: [+]process-running ok Feb 28 03:38:04 crc kubenswrapper[4819]: healthz check failed Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.673723 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.757026 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.757386 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-28 03:38:05.257359474 +0000 UTC m=+223.722928322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.757980 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.758303 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.258295437 +0000 UTC m=+223.723864295 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.761555 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq" event={"ID":"b9b0d49a-8f6e-4dd8-8bda-6842443fab5e","Type":"ContainerDied","Data":"541ab287d74d3ec1a41b7100e1270b510ee192f654491246d47500770f67bc3f"}
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.761604 4819 scope.go:117] "RemoveContainer" containerID="c5411853737b74e235b68625a1a17db140f9313f827fb3d497e36b83a7511168"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.761621 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b4blq"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.764196 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4" event={"ID":"3a12e5e4-9129-4a00-ae0c-684869c0cff7","Type":"ContainerDied","Data":"ae8cb9cc4375adb1d1edd391ffdd7fe94671e73f49efca5fea98e3a368205915"}
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.764222 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae8cb9cc4375adb1d1edd391ffdd7fe94671e73f49efca5fea98e3a368205915"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.764212 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537490-qt4j4"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.765940 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd" event={"ID":"257ec8d9-46b8-445b-883d-cd842a4b8b61","Type":"ContainerDied","Data":"fb21b0de251f5a413d09bc649f5c318f8a3e46cfda0976e61289193a9aae11a7"}
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.766040 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.794732 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b4blq"]
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.796368 4819 scope.go:117] "RemoveContainer" containerID="b9744f1564298e5223aea252aa1f4403d97b5e288646626850fa6bfbb0ec040a"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.802450 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b4blq"]
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.809319 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"]
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.812728 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9z6bd"]
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.859754 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.860116 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.360102469 +0000 UTC m=+223.825671327 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.960753 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.961079 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.461063381 +0000 UTC m=+223.926632229 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.994390 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8hnqz"]
Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.994579 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257ec8d9-46b8-445b-883d-cd842a4b8b61" containerName="route-controller-manager"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.994590 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="257ec8d9-46b8-445b-883d-cd842a4b8b61" containerName="route-controller-manager"
Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.994607 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a12e5e4-9129-4a00-ae0c-684869c0cff7" containerName="collect-profiles"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.994614 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a12e5e4-9129-4a00-ae0c-684869c0cff7" containerName="collect-profiles"
Feb 28 03:38:04 crc kubenswrapper[4819]: E0228 03:38:04.994626 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" containerName="controller-manager"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.994635 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" containerName="controller-manager"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.994714 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a12e5e4-9129-4a00-ae0c-684869c0cff7" containerName="collect-profiles"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.994725 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="257ec8d9-46b8-445b-883d-cd842a4b8b61" containerName="route-controller-manager"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.994733 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" containerName="controller-manager"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.995358 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hnqz"
Feb 28 03:38:04 crc kubenswrapper[4819]: I0228 03:38:04.998157 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.002361 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hnqz"]
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.041763 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c7b554d99-59qc7"]
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.045564 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.046617 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"]
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.047476 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.053190 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.053653 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.053842 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.054049 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.054411 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.054635 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.054856 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.055371 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.055367 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.055535 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.055528 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.055807 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.060384 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.061897 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.062380 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-utilities\") pod \"community-operators-8hnqz\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " pod="openshift-marketplace/community-operators-8hnqz"
Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.062507 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.562483614 +0000 UTC m=+224.028052472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.062596 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-catalog-content\") pod \"community-operators-8hnqz\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " pod="openshift-marketplace/community-operators-8hnqz"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.062745 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwwx8\" (UniqueName: \"kubernetes.io/projected/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-kube-api-access-wwwx8\") pod \"community-operators-8hnqz\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " pod="openshift-marketplace/community-operators-8hnqz"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.073096 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c7b554d99-59qc7"]
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.090968 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"]
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.164809 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4551cd47-079d-4f77-939b-a32ae73acca0-serving-cert\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.164859 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prnpx\" (UniqueName: \"kubernetes.io/projected/4551cd47-079d-4f77-939b-a32ae73acca0-kube-api-access-prnpx\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.164887 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.164919 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-utilities\") pod \"community-operators-8hnqz\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " pod="openshift-marketplace/community-operators-8hnqz"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.164951 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-config\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.164971 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-config\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.164986 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-catalog-content\") pod \"community-operators-8hnqz\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " pod="openshift-marketplace/community-operators-8hnqz"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.165007 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-client-ca\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.165048 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-proxy-ca-bundles\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.165066 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/264e677e-e90f-4bfc-a32f-486d82cd63c4-serving-cert\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.165082 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-client-ca\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.165118 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4l22\" (UniqueName: \"kubernetes.io/projected/264e677e-e90f-4bfc-a32f-486d82cd63c4-kube-api-access-p4l22\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.165161 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwwx8\" (UniqueName: \"kubernetes.io/projected/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-kube-api-access-wwwx8\") pod \"community-operators-8hnqz\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " pod="openshift-marketplace/community-operators-8hnqz"
Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.165294 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.665277121 +0000 UTC m=+224.130845979 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.165381 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-utilities\") pod \"community-operators-8hnqz\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " pod="openshift-marketplace/community-operators-8hnqz"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.165657 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-catalog-content\") pod \"community-operators-8hnqz\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " pod="openshift-marketplace/community-operators-8hnqz"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.187541 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwwx8\" (UniqueName: \"kubernetes.io/projected/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-kube-api-access-wwwx8\") pod \"community-operators-8hnqz\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " pod="openshift-marketplace/community-operators-8hnqz"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.188657 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bkx4p"]
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.190211 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bkx4p"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.192816 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.195854 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bkx4p"]
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.275981 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276169 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-config\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276196 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-config\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276220 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49m2p\" (UniqueName: \"kubernetes.io/projected/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-kube-api-access-49m2p\") pod \"certified-operators-bkx4p\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " pod="openshift-marketplace/certified-operators-bkx4p"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276239 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-utilities\") pod \"certified-operators-bkx4p\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " pod="openshift-marketplace/certified-operators-bkx4p"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276282 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-client-ca\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276320 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-proxy-ca-bundles\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276339 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/264e677e-e90f-4bfc-a32f-486d82cd63c4-serving-cert\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276357 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-client-ca\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276393 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4l22\" (UniqueName: \"kubernetes.io/projected/264e677e-e90f-4bfc-a32f-486d82cd63c4-kube-api-access-p4l22\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276430 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-catalog-content\") pod \"certified-operators-bkx4p\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " pod="openshift-marketplace/certified-operators-bkx4p"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276456 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4551cd47-079d-4f77-939b-a32ae73acca0-serving-cert\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.276475 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prnpx\" (UniqueName: \"kubernetes.io/projected/4551cd47-079d-4f77-939b-a32ae73acca0-kube-api-access-prnpx\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.277754 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-config\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.278066 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.778040116 +0000 UTC m=+224.243608974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.278303 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-client-ca\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.278841 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-client-ca\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.282736 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-config\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.284552 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bj4zw"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.286905 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/264e677e-e90f-4bfc-a32f-486d82cd63c4-serving-cert\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.288644 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4551cd47-079d-4f77-939b-a32ae73acca0-serving-cert\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.294808 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prnpx\" (UniqueName: \"kubernetes.io/projected/4551cd47-079d-4f77-939b-a32ae73acca0-kube-api-access-prnpx\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.297751 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-proxy-ca-bundles\") pod \"controller-manager-5c7b554d99-59qc7\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") " pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.302864 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4l22\" (UniqueName: \"kubernetes.io/projected/264e677e-e90f-4bfc-a32f-486d82cd63c4-kube-api-access-p4l22\") pod \"route-controller-manager-66cfdbc577-dbhm2\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") " pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.310699 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hnqz"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.370367 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.377924 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-catalog-content\") pod \"certified-operators-bkx4p\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " pod="openshift-marketplace/certified-operators-bkx4p"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.377976 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.378013 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49m2p\" (UniqueName: \"kubernetes.io/projected/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-kube-api-access-49m2p\") pod \"certified-operators-bkx4p\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " pod="openshift-marketplace/certified-operators-bkx4p"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.378031 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-utilities\") pod \"certified-operators-bkx4p\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " pod="openshift-marketplace/certified-operators-bkx4p"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.378482 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-utilities\") pod \"certified-operators-bkx4p\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " pod="openshift-marketplace/certified-operators-bkx4p"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.378515 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-catalog-content\") pod \"certified-operators-bkx4p\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " pod="openshift-marketplace/certified-operators-bkx4p"
Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.379315 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.879298064 +0000 UTC m=+224.344866922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.387480 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g84jc"]
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.390337 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g84jc"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.391209 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.394125 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g84jc"]
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.404935 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49m2p\" (UniqueName: \"kubernetes.io/projected/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-kube-api-access-49m2p\") pod \"certified-operators-bkx4p\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " pod="openshift-marketplace/certified-operators-bkx4p"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.478820 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.479030 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.979003285 +0000 UTC m=+224.444572143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.479374 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-utilities\") pod \"community-operators-g84jc\" (UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " pod="openshift-marketplace/community-operators-g84jc"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.479449 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.479484 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-catalog-content\") pod \"community-operators-g84jc\" (UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " pod="openshift-marketplace/community-operators-g84jc"
Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.479514 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnc8w\" (UniqueName:
\"kubernetes.io/projected/4421e7f5-138c-48da-9252-80593db19b91-kube-api-access-pnc8w\") pod \"community-operators-g84jc\" (UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.479766 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:05.979755003 +0000 UTC m=+224.445323861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.513941 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bkx4p" Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.580433 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.080417327 +0000 UTC m=+224.545986185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.580459 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.580638 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.580679 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-catalog-content\") pod \"community-operators-g84jc\" (UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.580714 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnc8w\" (UniqueName: \"kubernetes.io/projected/4421e7f5-138c-48da-9252-80593db19b91-kube-api-access-pnc8w\") pod \"community-operators-g84jc\" 
(UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.580744 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-utilities\") pod \"community-operators-g84jc\" (UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.581466 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-utilities\") pod \"community-operators-g84jc\" (UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.581675 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.081667718 +0000 UTC m=+224.547236576 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.583542 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-catalog-content\") pod \"community-operators-g84jc\" (UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.586642 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8mg46"] Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.588004 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.598965 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mg46"] Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.600601 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnc8w\" (UniqueName: \"kubernetes.io/projected/4421e7f5-138c-48da-9252-80593db19b91-kube-api-access-pnc8w\") pod \"community-operators-g84jc\" (UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.671023 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 03:38:05 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld Feb 28 03:38:05 crc kubenswrapper[4819]: [+]process-running ok Feb 28 03:38:05 crc kubenswrapper[4819]: healthz check failed Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.671084 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.682116 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.682477 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-catalog-content\") pod \"certified-operators-8mg46\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.682521 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-utilities\") pod \"certified-operators-8mg46\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.682588 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2h5f\" (UniqueName: \"kubernetes.io/projected/bb2d247d-3cce-4daa-9a51-20769f987756-kube-api-access-b2h5f\") pod \"certified-operators-8mg46\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.682686 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.182673571 +0000 UTC m=+224.648242429 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.708033 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.733185 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bkx4p"] Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.751842 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hnqz"] Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.754195 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.754804 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.756343 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.758890 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.759007 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.783568 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2h5f\" (UniqueName: \"kubernetes.io/projected/bb2d247d-3cce-4daa-9a51-20769f987756-kube-api-access-b2h5f\") pod \"certified-operators-8mg46\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.783641 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-catalog-content\") pod \"certified-operators-8mg46\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.783669 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-utilities\") pod \"certified-operators-8mg46\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.783720 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.783969 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.28395659 +0000 UTC m=+224.749525448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.784523 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkx4p" event={"ID":"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a","Type":"ContainerStarted","Data":"88616beb171cc11401e3cb0f87315c8133e576797eb28800b51ba4a6cd06fd71"} Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.785639 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hnqz" event={"ID":"2b9aa27e-76e7-4507-bca7-0ee08ff3a968","Type":"ContainerStarted","Data":"015578329e0e9e690669b26b6f2f63027e9135143e69092df9aa14013c6e69fa"} Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.831682 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"] Feb 28 03:38:05 crc 
kubenswrapper[4819]: I0228 03:38:05.870964 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c7b554d99-59qc7"] Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.876700 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-catalog-content\") pod \"certified-operators-8mg46\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.876777 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-utilities\") pod \"certified-operators-8mg46\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.881525 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2h5f\" (UniqueName: \"kubernetes.io/projected/bb2d247d-3cce-4daa-9a51-20769f987756-kube-api-access-b2h5f\") pod \"certified-operators-8mg46\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.885572 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.885933 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.385907936 +0000 UTC m=+224.851476794 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.886043 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7707a53-276f-4bd4-9818-36007238569e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7707a53-276f-4bd4-9818-36007238569e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.886118 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7707a53-276f-4bd4-9818-36007238569e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a7707a53-276f-4bd4-9818-36007238569e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.886195 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.886535 4819 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.386527502 +0000 UTC m=+224.852096360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.889938 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g84jc"] Feb 28 03:38:05 crc kubenswrapper[4819]: W0228 03:38:05.893415 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4551cd47_079d_4f77_939b_a32ae73acca0.slice/crio-509e67509e1f83b29db5cede0262185e9cf17401270234211315a51086c555ec WatchSource:0}: Error finding container 509e67509e1f83b29db5cede0262185e9cf17401270234211315a51086c555ec: Status 404 returned error can't find the container with id 509e67509e1f83b29db5cede0262185e9cf17401270234211315a51086c555ec Feb 28 03:38:05 crc kubenswrapper[4819]: W0228 03:38:05.895813 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod264e677e_e90f_4bfc_a32f_486d82cd63c4.slice/crio-a777eafa9246410a4e5ef0593fd8b31d9caa9175d559fcb5290ffd3ebeb7d740 WatchSource:0}: Error finding container a777eafa9246410a4e5ef0593fd8b31d9caa9175d559fcb5290ffd3ebeb7d740: Status 404 returned error can't find the container with id a777eafa9246410a4e5ef0593fd8b31d9caa9175d559fcb5290ffd3ebeb7d740 Feb 28 03:38:05 crc kubenswrapper[4819]: 
W0228 03:38:05.902411 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4421e7f5_138c_48da_9252_80593db19b91.slice/crio-946016bbbc55aa792f390c94887ca884a3843b31823b607e59a738d8be1baf80 WatchSource:0}: Error finding container 946016bbbc55aa792f390c94887ca884a3843b31823b607e59a738d8be1baf80: Status 404 returned error can't find the container with id 946016bbbc55aa792f390c94887ca884a3843b31823b607e59a738d8be1baf80 Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.911632 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.986839 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.987027 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.487011391 +0000 UTC m=+224.952580249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.987148 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7707a53-276f-4bd4-9818-36007238569e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7707a53-276f-4bd4-9818-36007238569e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.987191 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7707a53-276f-4bd4-9818-36007238569e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a7707a53-276f-4bd4-9818-36007238569e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.987217 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:05 crc kubenswrapper[4819]: I0228 03:38:05.987498 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7707a53-276f-4bd4-9818-36007238569e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"a7707a53-276f-4bd4-9818-36007238569e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 03:38:05 crc kubenswrapper[4819]: E0228 03:38:05.987564 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.487557415 +0000 UTC m=+224.953126273 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.003070 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7707a53-276f-4bd4-9818-36007238569e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a7707a53-276f-4bd4-9818-36007238569e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.077505 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.088029 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.088299 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.588227519 +0000 UTC m=+225.053796377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.088505 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.088900 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.588875265 +0000 UTC m=+225.054444123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.190600 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.190744 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.690717619 +0000 UTC m=+225.156286477 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.190883 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.191194 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.69118783 +0000 UTC m=+225.156756688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.264556 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.291585 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.291748 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.791723001 +0000 UTC m=+225.257291869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.292139 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.292355 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.792339737 +0000 UTC m=+225.257908595 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.348295 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.366691 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mg46"]
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.382075 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="257ec8d9-46b8-445b-883d-cd842a4b8b61" path="/var/lib/kubelet/pods/257ec8d9-46b8-445b-883d-cd842a4b8b61/volumes"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.383056 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b0d49a-8f6e-4dd8-8bda-6842443fab5e" path="/var/lib/kubelet/pods/b9b0d49a-8f6e-4dd8-8bda-6842443fab5e/volumes"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.392235 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.393302 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.393458 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.393580 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.893559644 +0000 UTC m=+225.359128502 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.393769 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.394486 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.894470607 +0000 UTC m=+225.360039465 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: W0228 03:38:06.442692 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb2d247d_3cce_4daa_9a51_20769f987756.slice/crio-7c9994d76d11029219146aae5c02a04a39ee45105bb4ec5d89b8b89937c33871 WatchSource:0}: Error finding container 7c9994d76d11029219146aae5c02a04a39ee45105bb4ec5d89b8b89937c33871: Status 404 returned error can't find the container with id 7c9994d76d11029219146aae5c02a04a39ee45105bb4ec5d89b8b89937c33871
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.474784 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.474829 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.480778 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.494844 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.495016 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.994984687 +0000 UTC m=+225.460553545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.495382 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.496815 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:06.996802293 +0000 UTC m=+225.462371151 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.506917 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.507496 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.509561 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.509846 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.519357 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.544921 4819 patch_prober.go:28] interesting pod/downloads-7954f5f757-trsfk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body=
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.544961 4819 patch_prober.go:28] interesting pod/downloads-7954f5f757-trsfk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body=
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.544969 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-trsfk" podUID="56a73ad1-cc3f-445b-8e0a-d8ff6937ba57" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.545011 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-trsfk" podUID="56a73ad1-cc3f-445b-8e0a-d8ff6937ba57" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.572515 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bpmrd"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.586759 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bpmrd"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.596683 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.596805 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:07.096780938 +0000 UTC m=+225.562349796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.596920 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.597109 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.597187 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.597302 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:07.097286371 +0000 UTC m=+225.562855229 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.606417 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.607266 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-77ljw"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.618287 4819 patch_prober.go:28] interesting pod/console-f9d7485db-77ljw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.618359 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-77ljw" podUID="304c4783-b723-4164-b000-8ae81986da3a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.668470 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-t9qxt"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.671126 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 28 03:38:06 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]process-running ok
Feb 28 03:38:06 crc kubenswrapper[4819]: healthz check failed
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.671162 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.698053 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.698361 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.698424 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.698728 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:07.198714434 +0000 UTC m=+225.664283292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.699999 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.723904 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.797538 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7" event={"ID":"4551cd47-079d-4f77-939b-a32ae73acca0","Type":"ContainerStarted","Data":"b3660d3e01aef67b5920dd72e65c97bca09b9af0010ee40bb5bb03dcc2201811"}
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.797583 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7" event={"ID":"4551cd47-079d-4f77-939b-a32ae73acca0","Type":"ContainerStarted","Data":"509e67509e1f83b29db5cede0262185e9cf17401270234211315a51086c555ec"}
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.797660 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.799804 4819 generic.go:334] "Generic (PLEG): container finished" podID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" containerID="2f6eda87d4dba0af09fb552755ebe8f10ed9c1015dde172b938cad4a8525c895" exitCode=0
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.799841 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkx4p" event={"ID":"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a","Type":"ContainerDied","Data":"2f6eda87d4dba0af09fb552755ebe8f10ed9c1015dde172b938cad4a8525c895"}
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.799972 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.800752 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:07.300738212 +0000 UTC m=+225.766307070 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.807798 4819 generic.go:334] "Generic (PLEG): container finished" podID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" containerID="e6db8a5be53e66ab8050788e1bb7a99d56b1f7f522b5a78290d642087ad168eb" exitCode=0
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.808121 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hnqz" event={"ID":"2b9aa27e-76e7-4507-bca7-0ee08ff3a968","Type":"ContainerDied","Data":"e6db8a5be53e66ab8050788e1bb7a99d56b1f7f522b5a78290d642087ad168eb"}
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.811002 4819 patch_prober.go:28] interesting pod/apiserver-76f77b778f-2tc9v container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]log ok
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]etcd ok
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]poststarthook/max-in-flight-filter ok
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Feb 28 03:38:06 crc kubenswrapper[4819]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Feb 28 03:38:06 crc kubenswrapper[4819]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]poststarthook/project.openshift.io-projectcache ok
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Feb 28 03:38:06 crc kubenswrapper[4819]: [-]poststarthook/openshift.io-startinformers failed: reason withheld
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]poststarthook/openshift.io-restmapperupdater ok
Feb 28 03:38:06 crc kubenswrapper[4819]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 28 03:38:06 crc kubenswrapper[4819]: livez check failed
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.811050 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v" podUID="ed79a80c-ec3b-4446-9ad3-4e1906715cd7" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.814356 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2" event={"ID":"264e677e-e90f-4bfc-a32f-486d82cd63c4","Type":"ContainerStarted","Data":"69eb3dbe97c4c675300eff8ba6cf2c4ce188157cb860f7f984b88d67e59b1cff"}
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.814381 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2" event={"ID":"264e677e-e90f-4bfc-a32f-486d82cd63c4","Type":"ContainerStarted","Data":"a777eafa9246410a4e5ef0593fd8b31d9caa9175d559fcb5290ffd3ebeb7d740"}
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.814559 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.815439 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a7707a53-276f-4bd4-9818-36007238569e","Type":"ContainerStarted","Data":"f78479f7e0b91acfba9005d6b7ac680c234e639ede6cb858f43357e5bb6e8623"}
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.817082 4819 generic.go:334] "Generic (PLEG): container finished" podID="4421e7f5-138c-48da-9252-80593db19b91" containerID="7086958c75ac1b143f72111390cf344485bd7fdb0a0746fffb54a72894dc2220" exitCode=0
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.817122 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g84jc" event={"ID":"4421e7f5-138c-48da-9252-80593db19b91","Type":"ContainerDied","Data":"7086958c75ac1b143f72111390cf344485bd7fdb0a0746fffb54a72894dc2220"}
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.817143 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g84jc" event={"ID":"4421e7f5-138c-48da-9252-80593db19b91","Type":"ContainerStarted","Data":"946016bbbc55aa792f390c94887ca884a3843b31823b607e59a738d8be1baf80"}
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.828657 4819 ???:1] "http: TLS handshake error from 192.168.126.11:36220: no serving certificate available for the kubelet"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.850540 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.858590 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mg46" event={"ID":"bb2d247d-3cce-4daa-9a51-20769f987756","Type":"ContainerStarted","Data":"7c9994d76d11029219146aae5c02a04a39ee45105bb4ec5d89b8b89937c33871"}
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.869536 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.870399 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gjpcj"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.901968 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:06 crc kubenswrapper[4819]: E0228 03:38:06.903221 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:07.403198691 +0000 UTC m=+225.868767549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.906888 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7" podStartSLOduration=3.906855872 podStartE2EDuration="3.906855872s" podCreationTimestamp="2026-02-28 03:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:06.857096034 +0000 UTC m=+225.322664892" watchObservedRunningTime="2026-02-28 03:38:06.906855872 +0000 UTC m=+225.372424730"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.941876 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2" podStartSLOduration=3.941858952 podStartE2EDuration="3.941858952s" podCreationTimestamp="2026-02-28 03:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:06.922829469 +0000 UTC m=+225.388398327" watchObservedRunningTime="2026-02-28 03:38:06.941858952 +0000 UTC m=+225.407427810"
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.990719 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fxgv4"]
Feb 28 03:38:06 crc kubenswrapper[4819]: I0228 03:38:06.991666 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxgv4"
Feb 28 03:38:07 crc kubenswrapper[4819]: W0228 03:38:06.999518 4819 reflector.go:561] object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb": failed to list *v1.Secret: secrets "redhat-marketplace-dockercfg-x2ctb" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object
Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:06.999559 4819 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-x2ctb\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"redhat-marketplace-dockercfg-x2ctb\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.004633 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.014734 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:07.514719305 +0000 UTC m=+225.980288163 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.025514 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxgv4"] Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.090562 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mxlcb" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.105715 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.105929 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-utilities\") pod \"redhat-marketplace-fxgv4\" (UID: \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.105949 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8knm\" (UniqueName: \"kubernetes.io/projected/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-kube-api-access-l8knm\") pod \"redhat-marketplace-fxgv4\" (UID: 
\"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.105997 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-catalog-content\") pod \"redhat-marketplace-fxgv4\" (UID: \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.106138 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:07.606124069 +0000 UTC m=+226.071692927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.179502 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg9r5" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.193976 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgpkd" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.209582 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-utilities\") pod \"redhat-marketplace-fxgv4\" (UID: \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.209616 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8knm\" (UniqueName: \"kubernetes.io/projected/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-kube-api-access-l8knm\") pod \"redhat-marketplace-fxgv4\" (UID: \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.209684 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-catalog-content\") pod \"redhat-marketplace-fxgv4\" (UID: \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.209780 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.210031 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:07.710019843 +0000 UTC m=+226.175588701 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.210281 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-utilities\") pod \"redhat-marketplace-fxgv4\" (UID: \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.211003 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-catalog-content\") pod \"redhat-marketplace-fxgv4\" (UID: \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.251171 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8knm\" (UniqueName: \"kubernetes.io/projected/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-kube-api-access-l8knm\") pod \"redhat-marketplace-fxgv4\" (UID: \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.278068 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.313127 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.313986 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:07.813972759 +0000 UTC m=+226.279541617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.390743 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rnn2f"] Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.391700 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.405477 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnn2f"] Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.416229 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.416678 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:07.916664803 +0000 UTC m=+226.382233661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.438856 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.517117 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.517276 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.017240365 +0000 UTC m=+226.482809223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.517363 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-catalog-content\") pod \"redhat-marketplace-rnn2f\" (UID: \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.517399 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.517479 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4sll\" (UniqueName: \"kubernetes.io/projected/69f8a9ba-33dd-4c1b-8383-6a340bb82292-kube-api-access-r4sll\") pod \"redhat-marketplace-rnn2f\" (UID: \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.517503 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-utilities\") pod \"redhat-marketplace-rnn2f\" (UID: \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.518004 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.017997304 +0000 UTC m=+226.483566162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.618097 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.618239 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.118222447 +0000 UTC m=+226.583791305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.618307 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4sll\" (UniqueName: \"kubernetes.io/projected/69f8a9ba-33dd-4c1b-8383-6a340bb82292-kube-api-access-r4sll\") pod \"redhat-marketplace-rnn2f\" (UID: \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.618331 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-utilities\") pod \"redhat-marketplace-rnn2f\" (UID: \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.618398 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-catalog-content\") pod \"redhat-marketplace-rnn2f\" (UID: \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.618421 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: 
\"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.618685 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.118678278 +0000 UTC m=+226.584247136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.618894 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-utilities\") pod \"redhat-marketplace-rnn2f\" (UID: \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.619020 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-catalog-content\") pod \"redhat-marketplace-rnn2f\" (UID: \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.637707 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4sll\" (UniqueName: \"kubernetes.io/projected/69f8a9ba-33dd-4c1b-8383-6a340bb82292-kube-api-access-r4sll\") pod \"redhat-marketplace-rnn2f\" (UID: 
\"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.670807 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 28 03:38:07 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld Feb 28 03:38:07 crc kubenswrapper[4819]: [+]process-running ok Feb 28 03:38:07 crc kubenswrapper[4819]: healthz check failed Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.670856 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.719448 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.719816 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.219802914 +0000 UTC m=+226.685371772 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.821278 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.821562 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.321550375 +0000 UTC m=+226.787119233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.840131 4819 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.873104 4819 generic.go:334] "Generic (PLEG): container finished" podID="bb2d247d-3cce-4daa-9a51-20769f987756" containerID="e8b2633bf0e9bda7b950fddff07ae525840cafdc4ed7e4eb40a815e32bf8311b" exitCode=0 Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.873192 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mg46" event={"ID":"bb2d247d-3cce-4daa-9a51-20769f987756","Type":"ContainerDied","Data":"e8b2633bf0e9bda7b950fddff07ae525840cafdc4ed7e4eb40a815e32bf8311b"} Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.880003 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b72rs" event={"ID":"85ff71ae-ba6c-4f29-9645-1eb02dc904f1","Type":"ContainerStarted","Data":"d7a4227c83a5b5923c7d23053dabcdcb4b7b3ac114fb3d527f7f845438541cc6"} Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.882613 4819 generic.go:334] "Generic (PLEG): container finished" podID="a7707a53-276f-4bd4-9818-36007238569e" containerID="8703e3ccad4ab393fe82012ddff2cf5b1e4c69bc2baf4ba7e4e6d7fbb3772d30" exitCode=0 Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.883345 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a7707a53-276f-4bd4-9818-36007238569e","Type":"ContainerDied","Data":"8703e3ccad4ab393fe82012ddff2cf5b1e4c69bc2baf4ba7e4e6d7fbb3772d30"} Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.922671 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.922886 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.422860375 +0000 UTC m=+226.888429233 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 28 03:38:07 crc kubenswrapper[4819]: I0228 03:38:07.923025 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:38:07 crc kubenswrapper[4819]: E0228 03:38:07.923325 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.423310966 +0000 UTC m=+226.888879824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.024285 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:08 crc kubenswrapper[4819]: E0228 03:38:08.024502 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.524463312 +0000 UTC m=+226.990032170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.025506 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:08 crc kubenswrapper[4819]: E0228 03:38:08.026499 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.526483233 +0000 UTC m=+226.992052091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.128767 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:08 crc kubenswrapper[4819]: E0228 03:38:08.128948 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.628920261 +0000 UTC m=+227.094489119 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.129073 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:08 crc kubenswrapper[4819]: E0228 03:38:08.129390 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.629378462 +0000 UTC m=+227.094947320 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5w8xg" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.168001 4819 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-28T03:38:07.840149968Z","Handler":null,"Name":""}
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.230069 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:08 crc kubenswrapper[4819]: E0228 03:38:08.230526 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-28 03:38:08.730500558 +0000 UTC m=+227.196069416 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.255815 4819 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.255853 4819 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.320605 4819 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-marketplace/redhat-marketplace-fxgv4" secret="" err="failed to sync secret cache: timed out waiting for the condition"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.320683 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxgv4"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.331600 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.331653 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.331700 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.331755 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.337990 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.342074 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.349286 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.374443 4819 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.374508 4819 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.380047 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.386647 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnn2f"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.391078 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.394270 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zhxpx"]
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.395318 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zhxpx"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.397379 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.425396 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zhxpx"]
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.433951 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.445362 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.446846 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5w8xg\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") " pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.494093 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.541090 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.541284 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-utilities\") pod \"redhat-operators-zhxpx\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " pod="openshift-marketplace/redhat-operators-zhxpx"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.541327 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgh9m\" (UniqueName: \"kubernetes.io/projected/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-kube-api-access-qgh9m\") pod \"redhat-operators-zhxpx\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " pod="openshift-marketplace/redhat-operators-zhxpx"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.541443 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-catalog-content\") pod \"redhat-operators-zhxpx\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " pod="openshift-marketplace/redhat-operators-zhxpx"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.546289 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.630205 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.637346 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.642608 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-catalog-content\") pod \"redhat-operators-zhxpx\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " pod="openshift-marketplace/redhat-operators-zhxpx"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.642655 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-utilities\") pod \"redhat-operators-zhxpx\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " pod="openshift-marketplace/redhat-operators-zhxpx"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.642676 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgh9m\" (UniqueName: \"kubernetes.io/projected/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-kube-api-access-qgh9m\") pod \"redhat-operators-zhxpx\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " pod="openshift-marketplace/redhat-operators-zhxpx"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.643085 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-catalog-content\") pod \"redhat-operators-zhxpx\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " pod="openshift-marketplace/redhat-operators-zhxpx"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.643157 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-utilities\") pod \"redhat-operators-zhxpx\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " pod="openshift-marketplace/redhat-operators-zhxpx"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.656750 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgh9m\" (UniqueName: \"kubernetes.io/projected/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-kube-api-access-qgh9m\") pod \"redhat-operators-zhxpx\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " pod="openshift-marketplace/redhat-operators-zhxpx"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.669725 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 28 03:38:08 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld
Feb 28 03:38:08 crc kubenswrapper[4819]: [+]process-running ok
Feb 28 03:38:08 crc kubenswrapper[4819]: healthz check failed
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.669773 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.746236 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zhxpx"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.784844 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ggnf2"]
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.785870 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggnf2"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.797261 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ggnf2"]
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.945830 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-catalog-content\") pod \"redhat-operators-ggnf2\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " pod="openshift-marketplace/redhat-operators-ggnf2"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.945882 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf668\" (UniqueName: \"kubernetes.io/projected/2b7e1686-948f-45a4-b739-5e88a4489b8e-kube-api-access-mf668\") pod \"redhat-operators-ggnf2\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " pod="openshift-marketplace/redhat-operators-ggnf2"
Feb 28 03:38:08 crc kubenswrapper[4819]: I0228 03:38:08.946074 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-utilities\") pod \"redhat-operators-ggnf2\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " pod="openshift-marketplace/redhat-operators-ggnf2"
Feb 28 03:38:09 crc kubenswrapper[4819]: I0228 03:38:09.048007 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-utilities\") pod \"redhat-operators-ggnf2\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " pod="openshift-marketplace/redhat-operators-ggnf2"
Feb 28 03:38:09 crc kubenswrapper[4819]: I0228 03:38:09.048095 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-catalog-content\") pod \"redhat-operators-ggnf2\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " pod="openshift-marketplace/redhat-operators-ggnf2"
Feb 28 03:38:09 crc kubenswrapper[4819]: I0228 03:38:09.048138 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf668\" (UniqueName: \"kubernetes.io/projected/2b7e1686-948f-45a4-b739-5e88a4489b8e-kube-api-access-mf668\") pod \"redhat-operators-ggnf2\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " pod="openshift-marketplace/redhat-operators-ggnf2"
Feb 28 03:38:09 crc kubenswrapper[4819]: I0228 03:38:09.049475 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-utilities\") pod \"redhat-operators-ggnf2\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " pod="openshift-marketplace/redhat-operators-ggnf2"
Feb 28 03:38:09 crc kubenswrapper[4819]: I0228 03:38:09.050960 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-catalog-content\") pod \"redhat-operators-ggnf2\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " pod="openshift-marketplace/redhat-operators-ggnf2"
Feb 28 03:38:09 crc kubenswrapper[4819]: I0228 03:38:09.084357 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf668\" (UniqueName: \"kubernetes.io/projected/2b7e1686-948f-45a4-b739-5e88a4489b8e-kube-api-access-mf668\") pod \"redhat-operators-ggnf2\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " pod="openshift-marketplace/redhat-operators-ggnf2"
Feb 28 03:38:09 crc kubenswrapper[4819]: I0228 03:38:09.104514 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggnf2"
Feb 28 03:38:09 crc kubenswrapper[4819]: I0228 03:38:09.669405 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 28 03:38:09 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld
Feb 28 03:38:09 crc kubenswrapper[4819]: [+]process-running ok
Feb 28 03:38:09 crc kubenswrapper[4819]: healthz check failed
Feb 28 03:38:09 crc kubenswrapper[4819]: I0228 03:38:09.669752 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:38:10 crc kubenswrapper[4819]: I0228 03:38:10.379565 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Feb 28 03:38:10 crc kubenswrapper[4819]: I0228 03:38:10.670598 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 28 03:38:10 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld
Feb 28 03:38:10 crc kubenswrapper[4819]: [+]process-running ok
Feb 28 03:38:10 crc kubenswrapper[4819]: healthz check failed
Feb 28 03:38:10 crc kubenswrapper[4819]: I0228 03:38:10.670656 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:38:10 crc kubenswrapper[4819]: I0228 03:38:10.902852 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 28 03:38:10 crc kubenswrapper[4819]: I0228 03:38:10.903454 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a7707a53-276f-4bd4-9818-36007238569e","Type":"ContainerDied","Data":"f78479f7e0b91acfba9005d6b7ac680c234e639ede6cb858f43357e5bb6e8623"}
Feb 28 03:38:10 crc kubenswrapper[4819]: I0228 03:38:10.903505 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f78479f7e0b91acfba9005d6b7ac680c234e639ede6cb858f43357e5bb6e8623"
Feb 28 03:38:10 crc kubenswrapper[4819]: I0228 03:38:10.989183 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7707a53-276f-4bd4-9818-36007238569e-kube-api-access\") pod \"a7707a53-276f-4bd4-9818-36007238569e\" (UID: \"a7707a53-276f-4bd4-9818-36007238569e\") "
Feb 28 03:38:10 crc kubenswrapper[4819]: I0228 03:38:10.989271 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7707a53-276f-4bd4-9818-36007238569e-kubelet-dir\") pod \"a7707a53-276f-4bd4-9818-36007238569e\" (UID: \"a7707a53-276f-4bd4-9818-36007238569e\") "
Feb 28 03:38:10 crc kubenswrapper[4819]: I0228 03:38:10.989489 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7707a53-276f-4bd4-9818-36007238569e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a7707a53-276f-4bd4-9818-36007238569e" (UID: "a7707a53-276f-4bd4-9818-36007238569e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 28 03:38:10 crc kubenswrapper[4819]: I0228 03:38:10.990203 4819 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7707a53-276f-4bd4-9818-36007238569e-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 28 03:38:11 crc kubenswrapper[4819]: I0228 03:38:10.993914 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7707a53-276f-4bd4-9818-36007238569e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a7707a53-276f-4bd4-9818-36007238569e" (UID: "a7707a53-276f-4bd4-9818-36007238569e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:38:11 crc kubenswrapper[4819]: I0228 03:38:11.091951 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7707a53-276f-4bd4-9818-36007238569e-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 28 03:38:11 crc kubenswrapper[4819]: I0228 03:38:11.404517 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:38:11 crc kubenswrapper[4819]: I0228 03:38:11.408169 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-2tc9v"
Feb 28 03:38:11 crc kubenswrapper[4819]: I0228 03:38:11.670679 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 28 03:38:11 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld
Feb 28 03:38:11 crc kubenswrapper[4819]: [+]process-running ok
Feb 28 03:38:11 crc kubenswrapper[4819]: healthz check failed
Feb 28 03:38:11 crc kubenswrapper[4819]: I0228 03:38:11.670730 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:38:11 crc kubenswrapper[4819]: I0228 03:38:11.893717 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-q8wxx"
Feb 28 03:38:11 crc kubenswrapper[4819]: I0228 03:38:11.912872 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 28 03:38:11 crc kubenswrapper[4819]: I0228 03:38:11.982182 4819 ???:1] "http: TLS handshake error from 192.168.126.11:57406: no serving certificate available for the kubelet"
Feb 28 03:38:12 crc kubenswrapper[4819]: I0228 03:38:12.675670 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 28 03:38:12 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld
Feb 28 03:38:12 crc kubenswrapper[4819]: [+]process-running ok
Feb 28 03:38:12 crc kubenswrapper[4819]: healthz check failed
Feb 28 03:38:12 crc kubenswrapper[4819]: I0228 03:38:12.675731 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:38:12 crc kubenswrapper[4819]: I0228 03:38:12.972375 4819 ???:1] "http: TLS handshake error from 192.168.126.11:57420: no serving certificate available for the kubelet"
Feb 28 03:38:13 crc kubenswrapper[4819]: I0228 03:38:13.670351 4819 patch_prober.go:28] interesting pod/router-default-5444994796-t9qxt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 28 03:38:13 crc kubenswrapper[4819]: [-]has-synced failed: reason withheld
Feb 28 03:38:13 crc kubenswrapper[4819]: [+]process-running ok
Feb 28 03:38:13 crc kubenswrapper[4819]: healthz check failed
Feb 28 03:38:13 crc kubenswrapper[4819]: I0228 03:38:13.670705 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t9qxt" podUID="7a723a02-07f6-42e8-8317-b05eef10e3d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:38:13 crc kubenswrapper[4819]: I0228 03:38:13.920650 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxgv4"]
Feb 28 03:38:13 crc kubenswrapper[4819]: I0228 03:38:13.932488 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnn2f"]
Feb 28 03:38:14 crc kubenswrapper[4819]: I0228 03:38:14.068631 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zhxpx"]
Feb 28 03:38:14 crc kubenswrapper[4819]: I0228 03:38:14.184558 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 28 03:38:14 crc kubenswrapper[4819]: I0228 03:38:14.186738 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5w8xg"]
Feb 28 03:38:14 crc kubenswrapper[4819]: I0228 03:38:14.191439 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ggnf2"]
Feb 28 03:38:14 crc kubenswrapper[4819]: I0228 03:38:14.671490 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-t9qxt"
Feb 28 03:38:14 crc kubenswrapper[4819]: I0228 03:38:14.673903 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-t9qxt"
Feb 28 03:38:14 crc kubenswrapper[4819]: I0228 03:38:14.943546 4819 csr.go:261] certificate signing request csr-79579 is approved, waiting to be issued
Feb 28 03:38:14 crc kubenswrapper[4819]: I0228 03:38:14.947867 4819 csr.go:257] certificate signing request csr-79579 is issued
Feb 28 03:38:14 crc kubenswrapper[4819]: I0228 03:38:14.981214 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b72rs" event={"ID":"85ff71ae-ba6c-4f29-9645-1eb02dc904f1","Type":"ContainerStarted","Data":"96c79e867c9a24cce89f353a792715db50c14c189ac4cc5d28b7b06ddaf39b85"}
Feb 28 03:38:14 crc kubenswrapper[4819]: I0228 03:38:14.984448 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537496-dqglv" event={"ID":"23ec7136-9dc9-47c2-bf41-7b798e6bfe60","Type":"ContainerStarted","Data":"376bf5ca05bb87538fadf4786ccdaa2318653d06cdb4d4265201087d9b9385fd"}
Feb 28 03:38:15 crc kubenswrapper[4819]: I0228 03:38:15.002611 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537496-dqglv" podStartSLOduration=119.918721873 podStartE2EDuration="2m15.002596504s" podCreationTimestamp="2026-02-28 03:36:00 +0000 UTC" firstStartedPulling="2026-02-28 03:37:58.459956016 +0000 UTC m=+216.925524874" lastFinishedPulling="2026-02-28 03:38:13.543830647 +0000 UTC m=+232.009399505" observedRunningTime="2026-02-28 03:38:15.002540992 +0000 UTC m=+233.468109850" watchObservedRunningTime="2026-02-28 03:38:15.002596504 +0000 UTC m=+233.468165352"
Feb 28 03:38:15 crc kubenswrapper[4819]: I0228 03:38:15.953475 4819 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-12 12:11:01.74812869 +0000 UTC
Feb 28 03:38:15 crc kubenswrapper[4819]: I0228 03:38:15.953536 4819 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6176h32m45.794623762s for next certificate rotation
Feb 28 03:38:15 crc kubenswrapper[4819]: I0228 03:38:15.994373 4819 generic.go:334] "Generic (PLEG): container finished" podID="23ec7136-9dc9-47c2-bf41-7b798e6bfe60" containerID="376bf5ca05bb87538fadf4786ccdaa2318653d06cdb4d4265201087d9b9385fd" exitCode=0
Feb 28 03:38:15 crc kubenswrapper[4819]: I0228 03:38:15.994422 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537496-dqglv" event={"ID":"23ec7136-9dc9-47c2-bf41-7b798e6bfe60","Type":"ContainerDied","Data":"376bf5ca05bb87538fadf4786ccdaa2318653d06cdb4d4265201087d9b9385fd"}
Feb 28 03:38:16 crc kubenswrapper[4819]: I0228 03:38:16.563528 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-trsfk"
Feb 28 03:38:16 crc kubenswrapper[4819]: I0228 03:38:16.607088 4819 patch_prober.go:28] interesting pod/console-f9d7485db-77ljw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Feb 28 03:38:16 crc kubenswrapper[4819]: I0228 03:38:16.607154 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-77ljw" podUID="304c4783-b723-4164-b000-8ae81986da3a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused"
Feb 28 03:38:16 crc kubenswrapper[4819]: I0228 03:38:16.955356 4819 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-01 08:05:18.96693196 +0000 UTC
Feb 28 03:38:16 crc kubenswrapper[4819]: I0228 03:38:16.955402 4819 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7372h27m2.01153287s for next certificate rotation
Feb 28 03:38:19 crc kubenswrapper[4819]: W0228 03:38:19.535936 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-2ea996f7b80c360b960e686f33cab755128969887a00b03b39dc4a5c39e9a898 WatchSource:0}: Error finding container 2ea996f7b80c360b960e686f33cab755128969887a00b03b39dc4a5c39e9a898: Status 404 returned error can't find the container with id 2ea996f7b80c360b960e686f33cab755128969887a00b03b39dc4a5c39e9a898
Feb 28 03:38:19 crc kubenswrapper[4819]: I0228 03:38:19.551313 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr"
Feb 28 03:38:19 crc kubenswrapper[4819]: I0228 03:38:19.560041 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7eede0c-6dc0-48ac-8065-7e0d9ed91212-metrics-certs\") pod \"network-metrics-daemon-lbrtr\" (UID: \"e7eede0c-6dc0-48ac-8065-7e0d9ed91212\") " pod="openshift-multus/network-metrics-daemon-lbrtr"
Feb 28 03:38:19 crc kubenswrapper[4819]: I0228 03:38:19.743328 4819 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lbrtr" Feb 28 03:38:20 crc kubenswrapper[4819]: I0228 03:38:20.016302 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2ea996f7b80c360b960e686f33cab755128969887a00b03b39dc4a5c39e9a898"} Feb 28 03:38:20 crc kubenswrapper[4819]: W0228 03:38:20.748834 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod553c6c36_5977_4ecd_a7b4_9ffb9a095d3f.slice/crio-432a0549be230e82b3b49e5950ead1df74092d8829d489dff3fbaeba732afda1 WatchSource:0}: Error finding container 432a0549be230e82b3b49e5950ead1df74092d8829d489dff3fbaeba732afda1: Status 404 returned error can't find the container with id 432a0549be230e82b3b49e5950ead1df74092d8829d489dff3fbaeba732afda1 Feb 28 03:38:20 crc kubenswrapper[4819]: W0228 03:38:20.750844 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a8758ef_cad6_4c28_bc87_bea1b0c7d2b8.slice/crio-26be1f4c2a5789c5be9e8df543e47179ac83da44b0ec19c4bc9c74fd395c74c8 WatchSource:0}: Error finding container 26be1f4c2a5789c5be9e8df543e47179ac83da44b0ec19c4bc9c74fd395c74c8: Status 404 returned error can't find the container with id 26be1f4c2a5789c5be9e8df543e47179ac83da44b0ec19c4bc9c74fd395c74c8 Feb 28 03:38:20 crc kubenswrapper[4819]: W0228 03:38:20.756420 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69f8a9ba_33dd_4c1b_8383_6a340bb82292.slice/crio-dd61951b140ab3d8df38c00f66230ce8f59b2625b31437df379c22b7a850cc13 WatchSource:0}: Error finding container dd61951b140ab3d8df38c00f66230ce8f59b2625b31437df379c22b7a850cc13: Status 404 returned error can't find the container with id 
dd61951b140ab3d8df38c00f66230ce8f59b2625b31437df379c22b7a850cc13 Feb 28 03:38:20 crc kubenswrapper[4819]: W0228 03:38:20.777640 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd3891fbd_1ffc_4ab9_8d00_0871e7cabfdd.slice/crio-33abf535c0627c94e3f9f33f43e9d550a98c2928086936a9e62502185b6e599e WatchSource:0}: Error finding container 33abf535c0627c94e3f9f33f43e9d550a98c2928086936a9e62502185b6e599e: Status 404 returned error can't find the container with id 33abf535c0627c94e3f9f33f43e9d550a98c2928086936a9e62502185b6e599e Feb 28 03:38:20 crc kubenswrapper[4819]: W0228 03:38:20.780063 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b7e1686_948f_45a4_b739_5e88a4489b8e.slice/crio-2277b19318a132c806884b99604e2560c8bcc948eeaa24faeeb676850b0ec575 WatchSource:0}: Error finding container 2277b19318a132c806884b99604e2560c8bcc948eeaa24faeeb676850b0ec575: Status 404 returned error can't find the container with id 2277b19318a132c806884b99604e2560c8bcc948eeaa24faeeb676850b0ec575 Feb 28 03:38:20 crc kubenswrapper[4819]: I0228 03:38:20.830591 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537496-dqglv" Feb 28 03:38:20 crc kubenswrapper[4819]: I0228 03:38:20.871905 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w82dc\" (UniqueName: \"kubernetes.io/projected/23ec7136-9dc9-47c2-bf41-7b798e6bfe60-kube-api-access-w82dc\") pod \"23ec7136-9dc9-47c2-bf41-7b798e6bfe60\" (UID: \"23ec7136-9dc9-47c2-bf41-7b798e6bfe60\") " Feb 28 03:38:20 crc kubenswrapper[4819]: I0228 03:38:20.880837 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ec7136-9dc9-47c2-bf41-7b798e6bfe60-kube-api-access-w82dc" (OuterVolumeSpecName: "kube-api-access-w82dc") pod "23ec7136-9dc9-47c2-bf41-7b798e6bfe60" (UID: "23ec7136-9dc9-47c2-bf41-7b798e6bfe60"). InnerVolumeSpecName "kube-api-access-w82dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:38:20 crc kubenswrapper[4819]: I0228 03:38:20.975801 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w82dc\" (UniqueName: \"kubernetes.io/projected/23ec7136-9dc9-47c2-bf41-7b798e6bfe60-kube-api-access-w82dc\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:21 crc kubenswrapper[4819]: I0228 03:38:21.034285 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd","Type":"ContainerStarted","Data":"33abf535c0627c94e3f9f33f43e9d550a98c2928086936a9e62502185b6e599e"} Feb 28 03:38:21 crc kubenswrapper[4819]: I0228 03:38:21.037665 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxgv4" event={"ID":"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8","Type":"ContainerStarted","Data":"26be1f4c2a5789c5be9e8df543e47179ac83da44b0ec19c4bc9c74fd395c74c8"} Feb 28 03:38:21 crc kubenswrapper[4819]: I0228 03:38:21.039209 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7bd2bb83cac668885261c27db5147f9383e6b9506dbb4e333b1992d9345dce02"} Feb 28 03:38:21 crc kubenswrapper[4819]: I0228 03:38:21.041620 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537496-dqglv" Feb 28 03:38:21 crc kubenswrapper[4819]: I0228 03:38:21.041625 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537496-dqglv" event={"ID":"23ec7136-9dc9-47c2-bf41-7b798e6bfe60","Type":"ContainerDied","Data":"ae6f91a1192a99fb46bbf533abd1f976b5c7a4e526898438d763b6b1fb54b3b9"} Feb 28 03:38:21 crc kubenswrapper[4819]: I0228 03:38:21.041680 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae6f91a1192a99fb46bbf533abd1f976b5c7a4e526898438d763b6b1fb54b3b9" Feb 28 03:38:21 crc kubenswrapper[4819]: I0228 03:38:21.044042 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnn2f" event={"ID":"69f8a9ba-33dd-4c1b-8383-6a340bb82292","Type":"ContainerStarted","Data":"dd61951b140ab3d8df38c00f66230ce8f59b2625b31437df379c22b7a850cc13"} Feb 28 03:38:21 crc kubenswrapper[4819]: I0228 03:38:21.045231 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhxpx" event={"ID":"6ef2ca7d-0b4b-4efa-aaad-af4137689efa","Type":"ContainerStarted","Data":"5aa9e141a13e996172a84d11aeccb114e93619f9d2b24b005805a73d0720f888"} Feb 28 03:38:21 crc kubenswrapper[4819]: I0228 03:38:21.046263 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggnf2" event={"ID":"2b7e1686-948f-45a4-b739-5e88a4489b8e","Type":"ContainerStarted","Data":"2277b19318a132c806884b99604e2560c8bcc948eeaa24faeeb676850b0ec575"} Feb 28 03:38:21 crc kubenswrapper[4819]: I0228 03:38:21.048845 4819 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" event={"ID":"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f","Type":"ContainerStarted","Data":"432a0549be230e82b3b49e5950ead1df74092d8829d489dff3fbaeba732afda1"} Feb 28 03:38:21 crc kubenswrapper[4819]: I0228 03:38:21.050340 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cadd0cff6a2af7341680b02fd639e33c022e40e4cb3f3a4ea7e9c8be8d2aab3d"} Feb 28 03:38:23 crc kubenswrapper[4819]: I0228 03:38:23.105461 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"] Feb 28 03:38:23 crc kubenswrapper[4819]: I0228 03:38:23.106023 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2" podUID="264e677e-e90f-4bfc-a32f-486d82cd63c4" containerName="route-controller-manager" containerID="cri-o://69eb3dbe97c4c675300eff8ba6cf2c4ce188157cb860f7f984b88d67e59b1cff" gracePeriod=30 Feb 28 03:38:23 crc kubenswrapper[4819]: I0228 03:38:23.112703 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c7b554d99-59qc7"] Feb 28 03:38:23 crc kubenswrapper[4819]: I0228 03:38:23.113164 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7" podUID="4551cd47-079d-4f77-939b-a32ae73acca0" containerName="controller-manager" containerID="cri-o://b3660d3e01aef67b5920dd72e65c97bca09b9af0010ee40bb5bb03dcc2201811" gracePeriod=30 Feb 28 03:38:24 crc kubenswrapper[4819]: I0228 03:38:24.068370 4819 generic.go:334] "Generic (PLEG): container finished" podID="4551cd47-079d-4f77-939b-a32ae73acca0" 
containerID="b3660d3e01aef67b5920dd72e65c97bca09b9af0010ee40bb5bb03dcc2201811" exitCode=0 Feb 28 03:38:24 crc kubenswrapper[4819]: I0228 03:38:24.068480 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7" event={"ID":"4551cd47-079d-4f77-939b-a32ae73acca0","Type":"ContainerDied","Data":"b3660d3e01aef67b5920dd72e65c97bca09b9af0010ee40bb5bb03dcc2201811"} Feb 28 03:38:24 crc kubenswrapper[4819]: I0228 03:38:24.070590 4819 generic.go:334] "Generic (PLEG): container finished" podID="264e677e-e90f-4bfc-a32f-486d82cd63c4" containerID="69eb3dbe97c4c675300eff8ba6cf2c4ce188157cb860f7f984b88d67e59b1cff" exitCode=0 Feb 28 03:38:24 crc kubenswrapper[4819]: I0228 03:38:24.070620 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2" event={"ID":"264e677e-e90f-4bfc-a32f-486d82cd63c4","Type":"ContainerDied","Data":"69eb3dbe97c4c675300eff8ba6cf2c4ce188157cb860f7f984b88d67e59b1cff"} Feb 28 03:38:25 crc kubenswrapper[4819]: I0228 03:38:25.371978 4819 patch_prober.go:28] interesting pod/controller-manager-5c7b554d99-59qc7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body= Feb 28 03:38:25 crc kubenswrapper[4819]: I0228 03:38:25.372598 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7" podUID="4551cd47-079d-4f77-939b-a32ae73acca0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Feb 28 03:38:26 crc kubenswrapper[4819]: I0228 03:38:26.413338 4819 patch_prober.go:28] interesting pod/route-controller-manager-66cfdbc577-dbhm2 container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": context deadline exceeded" start-of-body= Feb 28 03:38:26 crc kubenswrapper[4819]: I0228 03:38:26.413411 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2" podUID="264e677e-e90f-4bfc-a32f-486d82cd63c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": context deadline exceeded" Feb 28 03:38:26 crc kubenswrapper[4819]: I0228 03:38:26.611232 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:38:26 crc kubenswrapper[4819]: I0228 03:38:26.614967 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-77ljw" Feb 28 03:38:30 crc kubenswrapper[4819]: I0228 03:38:30.834045 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 03:38:30 crc kubenswrapper[4819]: I0228 03:38:30.834503 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 03:38:32 crc kubenswrapper[4819]: E0228 03:38:32.276227 4819 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 28 03:38:32 crc 
kubenswrapper[4819]: E0228 03:38:32.276491 4819 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wwwx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8hnqz_openshift-marketplace(2b9aa27e-76e7-4507-bca7-0ee08ff3a968): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 28 03:38:32 crc kubenswrapper[4819]: E0228 03:38:32.278485 4819 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8hnqz" podUID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" Feb 28 03:38:36 crc kubenswrapper[4819]: E0228 03:38:36.348662 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8hnqz" podUID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" Feb 28 03:38:36 crc kubenswrapper[4819]: I0228 03:38:36.371466 4819 patch_prober.go:28] interesting pod/controller-manager-5c7b554d99-59qc7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 28 03:38:36 crc kubenswrapper[4819]: I0228 03:38:36.371545 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7" podUID="4551cd47-079d-4f77-939b-a32ae73acca0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 28 03:38:36 crc kubenswrapper[4819]: I0228 03:38:36.393463 4819 patch_prober.go:28] interesting pod/route-controller-manager-66cfdbc577-dbhm2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 
28 03:38:36 crc kubenswrapper[4819]: I0228 03:38:36.393529 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2" podUID="264e677e-e90f-4bfc-a32f-486d82cd63c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 28 03:38:36 crc kubenswrapper[4819]: I0228 03:38:36.841800 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5pnmf" Feb 28 03:38:40 crc kubenswrapper[4819]: I0228 03:38:40.917209 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 28 03:38:40 crc kubenswrapper[4819]: E0228 03:38:40.917703 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7707a53-276f-4bd4-9818-36007238569e" containerName="pruner" Feb 28 03:38:40 crc kubenswrapper[4819]: I0228 03:38:40.917728 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7707a53-276f-4bd4-9818-36007238569e" containerName="pruner" Feb 28 03:38:40 crc kubenswrapper[4819]: E0228 03:38:40.917752 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ec7136-9dc9-47c2-bf41-7b798e6bfe60" containerName="oc" Feb 28 03:38:40 crc kubenswrapper[4819]: I0228 03:38:40.917766 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ec7136-9dc9-47c2-bf41-7b798e6bfe60" containerName="oc" Feb 28 03:38:40 crc kubenswrapper[4819]: I0228 03:38:40.917944 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ec7136-9dc9-47c2-bf41-7b798e6bfe60" containerName="oc" Feb 28 03:38:40 crc kubenswrapper[4819]: I0228 03:38:40.917988 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7707a53-276f-4bd4-9818-36007238569e" containerName="pruner" Feb 28 
03:38:40 crc kubenswrapper[4819]: I0228 03:38:40.919106 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 03:38:40 crc kubenswrapper[4819]: I0228 03:38:40.948572 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 28 03:38:41 crc kubenswrapper[4819]: I0228 03:38:41.081756 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/004f5681-552d-4d79-abeb-10f48d02f407-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"004f5681-552d-4d79-abeb-10f48d02f407\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 03:38:41 crc kubenswrapper[4819]: I0228 03:38:41.081880 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/004f5681-552d-4d79-abeb-10f48d02f407-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"004f5681-552d-4d79-abeb-10f48d02f407\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 03:38:41 crc kubenswrapper[4819]: I0228 03:38:41.184066 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/004f5681-552d-4d79-abeb-10f48d02f407-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"004f5681-552d-4d79-abeb-10f48d02f407\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 03:38:41 crc kubenswrapper[4819]: I0228 03:38:41.184173 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/004f5681-552d-4d79-abeb-10f48d02f407-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"004f5681-552d-4d79-abeb-10f48d02f407\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 03:38:41 crc kubenswrapper[4819]: I0228 
03:38:41.184302 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/004f5681-552d-4d79-abeb-10f48d02f407-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"004f5681-552d-4d79-abeb-10f48d02f407\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 03:38:41 crc kubenswrapper[4819]: I0228 03:38:41.217434 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/004f5681-552d-4d79-abeb-10f48d02f407-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"004f5681-552d-4d79-abeb-10f48d02f407\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 03:38:41 crc kubenswrapper[4819]: I0228 03:38:41.252003 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 03:38:42 crc kubenswrapper[4819]: E0228 03:38:42.244068 4819 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 28 03:38:42 crc kubenswrapper[4819]: E0228 03:38:42.244333 4819 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-49m2p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bkx4p_openshift-marketplace(9fe9f0aa-6448-48d7-900d-a8d5646a1a6a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 28 03:38:42 crc kubenswrapper[4819]: E0228 03:38:42.246480 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bkx4p" podUID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" Feb 28 03:38:44 crc 
kubenswrapper[4819]: E0228 03:38:44.457587 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bkx4p" podUID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.543467 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2" Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.548554 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7" Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.587076 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"] Feb 28 03:38:44 crc kubenswrapper[4819]: E0228 03:38:44.587762 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264e677e-e90f-4bfc-a32f-486d82cd63c4" containerName="route-controller-manager" Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.587793 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="264e677e-e90f-4bfc-a32f-486d82cd63c4" containerName="route-controller-manager" Feb 28 03:38:44 crc kubenswrapper[4819]: E0228 03:38:44.587825 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4551cd47-079d-4f77-939b-a32ae73acca0" containerName="controller-manager" Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.587837 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="4551cd47-079d-4f77-939b-a32ae73acca0" containerName="controller-manager" Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.588015 4819 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4551cd47-079d-4f77-939b-a32ae73acca0" containerName="controller-manager"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.588048 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="264e677e-e90f-4bfc-a32f-486d82cd63c4" containerName="route-controller-manager"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.588692 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: E0228 03:38:44.591470 4819 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 28 03:38:44 crc kubenswrapper[4819]: E0228 03:38:44.591635 4819 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b2h5f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8mg46_openshift-marketplace(bb2d247d-3cce-4daa-9a51-20769f987756): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 28 03:38:44 crc kubenswrapper[4819]: E0228 03:38:44.593081 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8mg46" podUID="bb2d247d-3cce-4daa-9a51-20769f987756"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.599588 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"]
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.737812 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/264e677e-e90f-4bfc-a32f-486d82cd63c4-serving-cert\") pod \"264e677e-e90f-4bfc-a32f-486d82cd63c4\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") "
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738335 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-client-ca\") pod \"4551cd47-079d-4f77-939b-a32ae73acca0\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") "
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738364 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4551cd47-079d-4f77-939b-a32ae73acca0-serving-cert\") pod \"4551cd47-079d-4f77-939b-a32ae73acca0\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") "
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738397 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-proxy-ca-bundles\") pod \"4551cd47-079d-4f77-939b-a32ae73acca0\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") "
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738445 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prnpx\" (UniqueName: \"kubernetes.io/projected/4551cd47-079d-4f77-939b-a32ae73acca0-kube-api-access-prnpx\") pod \"4551cd47-079d-4f77-939b-a32ae73acca0\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") "
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738481 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-config\") pod \"264e677e-e90f-4bfc-a32f-486d82cd63c4\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") "
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738509 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4l22\" (UniqueName: \"kubernetes.io/projected/264e677e-e90f-4bfc-a32f-486d82cd63c4-kube-api-access-p4l22\") pod \"264e677e-e90f-4bfc-a32f-486d82cd63c4\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") "
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738534 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-config\") pod \"4551cd47-079d-4f77-939b-a32ae73acca0\" (UID: \"4551cd47-079d-4f77-939b-a32ae73acca0\") "
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738631 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-client-ca\") pod \"264e677e-e90f-4bfc-a32f-486d82cd63c4\" (UID: \"264e677e-e90f-4bfc-a32f-486d82cd63c4\") "
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738801 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k766n\" (UniqueName: \"kubernetes.io/projected/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-kube-api-access-k766n\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738896 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-client-ca\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738952 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-serving-cert\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.738988 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-config\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.739213 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4551cd47-079d-4f77-939b-a32ae73acca0" (UID: "4551cd47-079d-4f77-939b-a32ae73acca0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.739930 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-client-ca" (OuterVolumeSpecName: "client-ca") pod "4551cd47-079d-4f77-939b-a32ae73acca0" (UID: "4551cd47-079d-4f77-939b-a32ae73acca0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.740960 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-config" (OuterVolumeSpecName: "config") pod "4551cd47-079d-4f77-939b-a32ae73acca0" (UID: "4551cd47-079d-4f77-939b-a32ae73acca0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.741353 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-client-ca" (OuterVolumeSpecName: "client-ca") pod "264e677e-e90f-4bfc-a32f-486d82cd63c4" (UID: "264e677e-e90f-4bfc-a32f-486d82cd63c4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.741512 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-config" (OuterVolumeSpecName: "config") pod "264e677e-e90f-4bfc-a32f-486d82cd63c4" (UID: "264e677e-e90f-4bfc-a32f-486d82cd63c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.755218 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/264e677e-e90f-4bfc-a32f-486d82cd63c4-kube-api-access-p4l22" (OuterVolumeSpecName: "kube-api-access-p4l22") pod "264e677e-e90f-4bfc-a32f-486d82cd63c4" (UID: "264e677e-e90f-4bfc-a32f-486d82cd63c4"). InnerVolumeSpecName "kube-api-access-p4l22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.755468 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264e677e-e90f-4bfc-a32f-486d82cd63c4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "264e677e-e90f-4bfc-a32f-486d82cd63c4" (UID: "264e677e-e90f-4bfc-a32f-486d82cd63c4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.755510 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4551cd47-079d-4f77-939b-a32ae73acca0-kube-api-access-prnpx" (OuterVolumeSpecName: "kube-api-access-prnpx") pod "4551cd47-079d-4f77-939b-a32ae73acca0" (UID: "4551cd47-079d-4f77-939b-a32ae73acca0"). InnerVolumeSpecName "kube-api-access-prnpx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.768677 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4551cd47-079d-4f77-939b-a32ae73acca0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4551cd47-079d-4f77-939b-a32ae73acca0" (UID: "4551cd47-079d-4f77-939b-a32ae73acca0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840452 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-client-ca\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840782 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-serving-cert\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840810 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-config\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840833 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k766n\" (UniqueName: \"kubernetes.io/projected/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-kube-api-access-k766n\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840902 4819 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-client-ca\") on node \"crc\" DevicePath \"\""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840913 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/264e677e-e90f-4bfc-a32f-486d82cd63c4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840921 4819 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-client-ca\") on node \"crc\" DevicePath \"\""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840930 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4551cd47-079d-4f77-939b-a32ae73acca0-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840938 4819 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840949 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prnpx\" (UniqueName: \"kubernetes.io/projected/4551cd47-079d-4f77-939b-a32ae73acca0-kube-api-access-prnpx\") on node \"crc\" DevicePath \"\""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840960 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/264e677e-e90f-4bfc-a32f-486d82cd63c4-config\") on node \"crc\" DevicePath \"\""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840968 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4l22\" (UniqueName: \"kubernetes.io/projected/264e677e-e90f-4bfc-a32f-486d82cd63c4-kube-api-access-p4l22\") on node \"crc\" DevicePath \"\""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.840978 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4551cd47-079d-4f77-939b-a32ae73acca0-config\") on node \"crc\" DevicePath \"\""
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.842421 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-client-ca\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.844839 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-config\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.853413 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-serving-cert\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.894057 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k766n\" (UniqueName: \"kubernetes.io/projected/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-kube-api-access-k766n\") pod \"route-controller-manager-66bb8d59cd-nz7r9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.949079 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lbrtr"]
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.957732 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:44 crc kubenswrapper[4819]: I0228 03:38:44.993782 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 28 03:38:45 crc kubenswrapper[4819]: W0228 03:38:45.048145 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod004f5681_552d_4d79_abeb_10f48d02f407.slice/crio-228571c881ccf18e9a1978900fc6b88d14e7a4c24a7963992b44ef86a830600d WatchSource:0}: Error finding container 228571c881ccf18e9a1978900fc6b88d14e7a4c24a7963992b44ef86a830600d: Status 404 returned error can't find the container with id 228571c881ccf18e9a1978900fc6b88d14e7a4c24a7963992b44ef86a830600d
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.217143 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g84jc" event={"ID":"4421e7f5-138c-48da-9252-80593db19b91","Type":"ContainerStarted","Data":"c6088b36defc72ee8dc953ca118634eee723072a3a092098568d50a920fac981"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.231074 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3611a5582f8e4911d1e30a536c45557c83dbeca96daf77f18b38a5efb31f5be1"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.236776 4819 generic.go:334] "Generic (PLEG): container finished" podID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" containerID="2662d25b134c07ac5eba4817e09493d351c76aac7da485e2bbcc294e09e46d9e" exitCode=0
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.236828 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnn2f" event={"ID":"69f8a9ba-33dd-4c1b-8383-6a340bb82292","Type":"ContainerDied","Data":"2662d25b134c07ac5eba4817e09493d351c76aac7da485e2bbcc294e09e46d9e"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.256666 4819 generic.go:334] "Generic (PLEG): container finished" podID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" containerID="86b08bc5d4f872cd21c5956c8f0ce593e136e0ed25575cb38e149fd34b593f65" exitCode=0
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.256744 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhxpx" event={"ID":"6ef2ca7d-0b4b-4efa-aaad-af4137689efa","Type":"ContainerDied","Data":"86b08bc5d4f872cd21c5956c8f0ce593e136e0ed25575cb38e149fd34b593f65"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.269498 4819 generic.go:334] "Generic (PLEG): container finished" podID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" containerID="3fd5b29fe4b10c27148f9ca8e965828bf738c2b6dadc01e258fa40cb4c4f98d2" exitCode=0
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.269575 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxgv4" event={"ID":"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8","Type":"ContainerDied","Data":"3fd5b29fe4b10c27148f9ca8e965828bf738c2b6dadc01e258fa40cb4c4f98d2"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.290120 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f2f6f5bfa59ded15d5002033d09417d9a5f07edfbc93bfbf5b61247b72a44dbe"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.291977 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=39.291942284 podStartE2EDuration="39.291942284s" podCreationTimestamp="2026-02-28 03:38:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:45.290515898 +0000 UTC m=+263.756084756" watchObservedRunningTime="2026-02-28 03:38:45.291942284 +0000 UTC m=+263.757511142"
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.298146 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" event={"ID":"e7eede0c-6dc0-48ac-8065-7e0d9ed91212","Type":"ContainerStarted","Data":"6512e0380c673f1c68eb6ee99f2844597fb826633048dfc325f02a912714379b"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.312882 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.316723 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2" event={"ID":"264e677e-e90f-4bfc-a32f-486d82cd63c4","Type":"ContainerDied","Data":"a777eafa9246410a4e5ef0593fd8b31d9caa9175d559fcb5290ffd3ebeb7d740"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.316784 4819 scope.go:117] "RemoveContainer" containerID="69eb3dbe97c4c675300eff8ba6cf2c4ce188157cb860f7f984b88d67e59b1cff"
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.333494 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"31eeed816703dbe2e24dc3104b977ec6ee31c977adcc9de1bba34ab9795ec78d"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.334036 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.335544 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537498-5h22n" event={"ID":"500425cb-63aa-43e1-bc7b-2c11c88826c5","Type":"ContainerStarted","Data":"333624774a7f72cc869a41ab7cf3fe9690ef9370fe3a1edb460d898b9163deab"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.338137 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"004f5681-552d-4d79-abeb-10f48d02f407","Type":"ContainerStarted","Data":"228571c881ccf18e9a1978900fc6b88d14e7a4c24a7963992b44ef86a830600d"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.339196 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" event={"ID":"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f","Type":"ContainerStarted","Data":"e322a2740de3b94d2c2051019d809947c93aaa50b2645aea27400ae4a384084a"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.339778 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.354041 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b72rs" event={"ID":"85ff71ae-ba6c-4f29-9645-1eb02dc904f1","Type":"ContainerStarted","Data":"0b22b606af0df8150ff0343246dbfb4e26dc346c0a8b8eef7e025291f0c663fe"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.357214 4819 generic.go:334] "Generic (PLEG): container finished" podID="2b7e1686-948f-45a4-b739-5e88a4489b8e" containerID="c96372d6bbf465c6dd30897be488008eb8733c6ebbf7e237c872360d55d8e964" exitCode=0
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.357314 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggnf2" event={"ID":"2b7e1686-948f-45a4-b739-5e88a4489b8e","Type":"ContainerDied","Data":"c96372d6bbf465c6dd30897be488008eb8733c6ebbf7e237c872360d55d8e964"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.381745 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7" event={"ID":"4551cd47-079d-4f77-939b-a32ae73acca0","Type":"ContainerDied","Data":"509e67509e1f83b29db5cede0262185e9cf17401270234211315a51086c555ec"}
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.381823 4819 scope.go:117] "RemoveContainer" containerID="b3660d3e01aef67b5920dd72e65c97bca09b9af0010ee40bb5bb03dcc2201811"
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.382139 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c7b554d99-59qc7"
Feb 28 03:38:45 crc kubenswrapper[4819]: E0228 03:38:45.388466 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8mg46" podUID="bb2d247d-3cce-4daa-9a51-20769f987756"
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.408076 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"]
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.423939 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" podStartSLOduration=205.423917337 podStartE2EDuration="3m25.423917337s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:45.399161039 +0000 UTC m=+263.864729907" watchObservedRunningTime="2026-02-28 03:38:45.423917337 +0000 UTC m=+263.889486195"
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.428171 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29537498-5h22n" podStartSLOduration=21.789685156 podStartE2EDuration="45.428154672s" podCreationTimestamp="2026-02-28 03:38:00 +0000 UTC" firstStartedPulling="2026-02-28 03:38:00.919574269 +0000 UTC m=+219.385143127" lastFinishedPulling="2026-02-28 03:38:24.558043745 +0000 UTC m=+243.023612643" observedRunningTime="2026-02-28 03:38:45.416146143 +0000 UTC m=+263.881715001" watchObservedRunningTime="2026-02-28 03:38:45.428154672 +0000 UTC m=+263.893723530"
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.444627 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-b72rs" podStartSLOduration=52.444611043 podStartE2EDuration="52.444611043s" podCreationTimestamp="2026-02-28 03:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:45.442174732 +0000 UTC m=+263.907743590" watchObservedRunningTime="2026-02-28 03:38:45.444611043 +0000 UTC m=+263.910179901"
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.472972 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"]
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.476406 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66cfdbc577-dbhm2"]
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.518305 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c7b554d99-59qc7"]
Feb 28 03:38:45 crc kubenswrapper[4819]: I0228 03:38:45.521188 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c7b554d99-59qc7"]
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.378204 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="264e677e-e90f-4bfc-a32f-486d82cd63c4" path="/var/lib/kubelet/pods/264e677e-e90f-4bfc-a32f-486d82cd63c4/volumes"
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.379451 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4551cd47-079d-4f77-939b-a32ae73acca0" path="/var/lib/kubelet/pods/4551cd47-079d-4f77-939b-a32ae73acca0/volumes"
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.398715 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" event={"ID":"e7eede0c-6dc0-48ac-8065-7e0d9ed91212","Type":"ContainerStarted","Data":"53fd07b116326cc297008479b29a56d146c1cca1ba1afc2f5475cd228b31b0cf"}
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.398756 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lbrtr" event={"ID":"e7eede0c-6dc0-48ac-8065-7e0d9ed91212","Type":"ContainerStarted","Data":"2df3c63739b0b178b9678afc24f45d6a1660c2f0ac840a6dfcc672a2258cea04"}
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.404450 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9" event={"ID":"c8e89cb8-dad5-4343-9e17-cb518a1c77b9","Type":"ContainerStarted","Data":"f94abce806a1b4956016af7deac6b85f345a4efb773734a9a6f62e7d2006027b"}
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.404474 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9" event={"ID":"c8e89cb8-dad5-4343-9e17-cb518a1c77b9","Type":"ContainerStarted","Data":"501e882fabd12f007914ae6a976910471ed0f38290d570cbf4f6dc082ed4904b"}
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.405006 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.409299 4819 generic.go:334] "Generic (PLEG): container finished" podID="4421e7f5-138c-48da-9252-80593db19b91" containerID="c6088b36defc72ee8dc953ca118634eee723072a3a092098568d50a920fac981" exitCode=0
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.409392 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g84jc" event={"ID":"4421e7f5-138c-48da-9252-80593db19b91","Type":"ContainerDied","Data":"c6088b36defc72ee8dc953ca118634eee723072a3a092098568d50a920fac981"}
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.412147 4819 generic.go:334] "Generic (PLEG): container finished" podID="500425cb-63aa-43e1-bc7b-2c11c88826c5" containerID="333624774a7f72cc869a41ab7cf3fe9690ef9370fe3a1edb460d898b9163deab" exitCode=0
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.412272 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537498-5h22n" event={"ID":"500425cb-63aa-43e1-bc7b-2c11c88826c5","Type":"ContainerDied","Data":"333624774a7f72cc869a41ab7cf3fe9690ef9370fe3a1edb460d898b9163deab"}
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.417121 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.418893 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lbrtr" podStartSLOduration=206.418876968 podStartE2EDuration="3m26.418876968s" podCreationTimestamp="2026-02-28 03:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:46.414681864 +0000 UTC m=+264.880250732" watchObservedRunningTime="2026-02-28 03:38:46.418876968 +0000 UTC m=+264.884445826"
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.419734 4819 generic.go:334] "Generic (PLEG): container finished" podID="d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd" containerID="5c0a13ed2e1e0416256d5cb61c72b756e4c8dea0478ff730cfede05191c6eb9d" exitCode=0
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.419997 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd","Type":"ContainerDied","Data":"5c0a13ed2e1e0416256d5cb61c72b756e4c8dea0478ff730cfede05191c6eb9d"}
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.425586 4819 generic.go:334] "Generic (PLEG): container finished" podID="004f5681-552d-4d79-abeb-10f48d02f407" containerID="1efc430478fed05f37d98153400a868622f44c6590b7d52dde55e519905e5b40" exitCode=0
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.425983 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"004f5681-552d-4d79-abeb-10f48d02f407","Type":"ContainerDied","Data":"1efc430478fed05f37d98153400a868622f44c6590b7d52dde55e519905e5b40"}
Feb 28 03:38:46 crc kubenswrapper[4819]: I0228 03:38:46.463094 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9" podStartSLOduration=3.463075711 podStartE2EDuration="3.463075711s" podCreationTimestamp="2026-02-28 03:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:46.459986714 +0000 UTC m=+264.925555572" watchObservedRunningTime="2026-02-28 03:38:46.463075711 +0000 UTC m=+264.928644569"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.080728 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"]
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.081712 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.085056 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.085220 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.085470 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.085877 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.086036 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.086068 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"]
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.087076 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.090728 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.188095 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-proxy-ca-bundles\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.188158 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae37104f-2a64-46f3-93e7-37b080445e85-serving-cert\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.188196 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l94x\" (UniqueName: \"kubernetes.io/projected/ae37104f-2a64-46f3-93e7-37b080445e85-kube-api-access-9l94x\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.188503 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-client-ca\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"
Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.188572 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-config\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"
Feb 28 03:38:47
crc kubenswrapper[4819]: I0228 03:38:47.289678 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-proxy-ca-bundles\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.289730 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae37104f-2a64-46f3-93e7-37b080445e85-serving-cert\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.289770 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l94x\" (UniqueName: \"kubernetes.io/projected/ae37104f-2a64-46f3-93e7-37b080445e85-kube-api-access-9l94x\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.289827 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-client-ca\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.289851 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-config\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " 
pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.290818 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-client-ca\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.291523 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-proxy-ca-bundles\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.291670 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-config\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.298006 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae37104f-2a64-46f3-93e7-37b080445e85-serving-cert\") pod \"controller-manager-dfdb5556c-xm7bx\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.307057 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l94x\" (UniqueName: \"kubernetes.io/projected/ae37104f-2a64-46f3-93e7-37b080445e85-kube-api-access-9l94x\") pod \"controller-manager-dfdb5556c-xm7bx\" 
(UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.402085 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.434155 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g84jc" event={"ID":"4421e7f5-138c-48da-9252-80593db19b91","Type":"ContainerStarted","Data":"b418e3e4f160f2d02be1019d365d8e8f1a3bfd82002ecbac8181e853c72acf83"} Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.452877 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g84jc" podStartSLOduration=2.42358562 podStartE2EDuration="42.452859925s" podCreationTimestamp="2026-02-28 03:38:05 +0000 UTC" firstStartedPulling="2026-02-28 03:38:06.821918209 +0000 UTC m=+225.287487067" lastFinishedPulling="2026-02-28 03:38:46.851192514 +0000 UTC m=+265.316761372" observedRunningTime="2026-02-28 03:38:47.451228704 +0000 UTC m=+265.916797562" watchObservedRunningTime="2026-02-28 03:38:47.452859925 +0000 UTC m=+265.918428783" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.706623 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.707996 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.720866 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.766396 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537498-5h22n" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.771687 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"] Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.775350 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.784820 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.796572 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8da55153-bfac-4bce-8c0b-2ad25d556549\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.796652 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-var-lock\") pod \"installer-9-crc\" (UID: \"8da55153-bfac-4bce-8c0b-2ad25d556549\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.796679 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8da55153-bfac-4bce-8c0b-2ad25d556549-kube-api-access\") pod \"installer-9-crc\" (UID: \"8da55153-bfac-4bce-8c0b-2ad25d556549\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.897385 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kube-api-access\") pod \"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd\" (UID: \"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd\") " Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.897438 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whsnn\" (UniqueName: \"kubernetes.io/projected/500425cb-63aa-43e1-bc7b-2c11c88826c5-kube-api-access-whsnn\") pod \"500425cb-63aa-43e1-bc7b-2c11c88826c5\" (UID: \"500425cb-63aa-43e1-bc7b-2c11c88826c5\") " Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.897460 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/004f5681-552d-4d79-abeb-10f48d02f407-kubelet-dir\") pod \"004f5681-552d-4d79-abeb-10f48d02f407\" (UID: \"004f5681-552d-4d79-abeb-10f48d02f407\") " Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.897519 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/004f5681-552d-4d79-abeb-10f48d02f407-kube-api-access\") pod \"004f5681-552d-4d79-abeb-10f48d02f407\" (UID: \"004f5681-552d-4d79-abeb-10f48d02f407\") " Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.897540 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kubelet-dir\") pod \"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd\" (UID: \"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd\") " Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.897819 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/004f5681-552d-4d79-abeb-10f48d02f407-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "004f5681-552d-4d79-abeb-10f48d02f407" (UID: "004f5681-552d-4d79-abeb-10f48d02f407"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.898504 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-var-lock\") pod \"installer-9-crc\" (UID: \"8da55153-bfac-4bce-8c0b-2ad25d556549\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.898562 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8da55153-bfac-4bce-8c0b-2ad25d556549-kube-api-access\") pod \"installer-9-crc\" (UID: \"8da55153-bfac-4bce-8c0b-2ad25d556549\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.898665 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8da55153-bfac-4bce-8c0b-2ad25d556549\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.898670 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd" (UID: "d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.898798 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-var-lock\") pod \"installer-9-crc\" (UID: \"8da55153-bfac-4bce-8c0b-2ad25d556549\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.898833 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8da55153-bfac-4bce-8c0b-2ad25d556549\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.899295 4819 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.899312 4819 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/004f5681-552d-4d79-abeb-10f48d02f407-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.902762 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd" (UID: "d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.903888 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/004f5681-552d-4d79-abeb-10f48d02f407-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "004f5681-552d-4d79-abeb-10f48d02f407" (UID: "004f5681-552d-4d79-abeb-10f48d02f407"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.903990 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500425cb-63aa-43e1-bc7b-2c11c88826c5-kube-api-access-whsnn" (OuterVolumeSpecName: "kube-api-access-whsnn") pod "500425cb-63aa-43e1-bc7b-2c11c88826c5" (UID: "500425cb-63aa-43e1-bc7b-2c11c88826c5"). InnerVolumeSpecName "kube-api-access-whsnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:38:47 crc kubenswrapper[4819]: I0228 03:38:47.919895 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8da55153-bfac-4bce-8c0b-2ad25d556549-kube-api-access\") pod \"installer-9-crc\" (UID: \"8da55153-bfac-4bce-8c0b-2ad25d556549\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.000182 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.000580 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whsnn\" (UniqueName: \"kubernetes.io/projected/500425cb-63aa-43e1-bc7b-2c11c88826c5-kube-api-access-whsnn\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.000591 4819 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/004f5681-552d-4d79-abeb-10f48d02f407-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.037673 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.450186 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.450204 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd","Type":"ContainerDied","Data":"33abf535c0627c94e3f9f33f43e9d550a98c2928086936a9e62502185b6e599e"} Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.451268 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33abf535c0627c94e3f9f33f43e9d550a98c2928086936a9e62502185b6e599e" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.453660 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"004f5681-552d-4d79-abeb-10f48d02f407","Type":"ContainerDied","Data":"228571c881ccf18e9a1978900fc6b88d14e7a4c24a7963992b44ef86a830600d"} Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.453699 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="228571c881ccf18e9a1978900fc6b88d14e7a4c24a7963992b44ef86a830600d" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.453729 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.456294 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hnqz" event={"ID":"2b9aa27e-76e7-4507-bca7-0ee08ff3a968","Type":"ContainerStarted","Data":"8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659"} Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.461635 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" event={"ID":"ae37104f-2a64-46f3-93e7-37b080445e85","Type":"ContainerStarted","Data":"b414e136fb8f979eb8fa7f3a735500db2a453662af49e2fbc24f78b5494c0cc7"} Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.461679 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" event={"ID":"ae37104f-2a64-46f3-93e7-37b080445e85","Type":"ContainerStarted","Data":"81c40b478b91053cf7a1f43bcaefd9ad2aa1fa78e6ac19b72e1ce973231dc39f"} Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.462403 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.464726 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537498-5h22n" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.465086 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537498-5h22n" event={"ID":"500425cb-63aa-43e1-bc7b-2c11c88826c5","Type":"ContainerDied","Data":"472faf0bfb7da0107e836214b705e09a0e1da2f1acc37474d5e93b0faa58d1f4"} Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.465102 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="472faf0bfb7da0107e836214b705e09a0e1da2f1acc37474d5e93b0faa58d1f4" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.480831 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.500283 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" podStartSLOduration=5.500266976 podStartE2EDuration="5.500266976s" podCreationTimestamp="2026-02-28 03:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:48.499687752 +0000 UTC m=+266.965256620" watchObservedRunningTime="2026-02-28 03:38:48.500266976 +0000 UTC m=+266.965835834" Feb 28 03:38:48 crc kubenswrapper[4819]: I0228 03:38:48.630884 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 28 03:38:49 crc kubenswrapper[4819]: I0228 03:38:49.473402 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8da55153-bfac-4bce-8c0b-2ad25d556549","Type":"ContainerStarted","Data":"0a943832223bbf0cc6819cf36f1155398918ffafdc28d4a25b1d908e5b4e5bd6"} Feb 28 03:38:49 crc kubenswrapper[4819]: I0228 03:38:49.474159 4819 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8da55153-bfac-4bce-8c0b-2ad25d556549","Type":"ContainerStarted","Data":"bceeb9aed333b9ecf850d28568521ec990e2b11fc531f7f7af93a30731d4c1c9"} Feb 28 03:38:49 crc kubenswrapper[4819]: I0228 03:38:49.476290 4819 generic.go:334] "Generic (PLEG): container finished" podID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" containerID="8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659" exitCode=0 Feb 28 03:38:49 crc kubenswrapper[4819]: I0228 03:38:49.476441 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hnqz" event={"ID":"2b9aa27e-76e7-4507-bca7-0ee08ff3a968","Type":"ContainerDied","Data":"8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659"} Feb 28 03:38:49 crc kubenswrapper[4819]: I0228 03:38:49.495050 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.495024373 podStartE2EDuration="2.495024373s" podCreationTimestamp="2026-02-28 03:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:38:49.492266095 +0000 UTC m=+267.957834973" watchObservedRunningTime="2026-02-28 03:38:49.495024373 +0000 UTC m=+267.960593251" Feb 28 03:38:51 crc kubenswrapper[4819]: I0228 03:38:51.493961 4819 generic.go:334] "Generic (PLEG): container finished" podID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" containerID="4fa04afe4493194bec58c9b195f70b2422a55c09b9172df0dfd27f19016af7cd" exitCode=0 Feb 28 03:38:51 crc kubenswrapper[4819]: I0228 03:38:51.494196 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnn2f" event={"ID":"69f8a9ba-33dd-4c1b-8383-6a340bb82292","Type":"ContainerDied","Data":"4fa04afe4493194bec58c9b195f70b2422a55c09b9172df0dfd27f19016af7cd"} Feb 28 03:38:51 crc kubenswrapper[4819]: I0228 
03:38:51.503570 4819 generic.go:334] "Generic (PLEG): container finished" podID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" containerID="0b203a36dfd19e346675ab907d79a0292a057e015f8b203451dc7e1d262b68ec" exitCode=0 Feb 28 03:38:51 crc kubenswrapper[4819]: I0228 03:38:51.503607 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxgv4" event={"ID":"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8","Type":"ContainerDied","Data":"0b203a36dfd19e346675ab907d79a0292a057e015f8b203451dc7e1d262b68ec"} Feb 28 03:38:52 crc kubenswrapper[4819]: I0228 03:38:52.520490 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hnqz" event={"ID":"2b9aa27e-76e7-4507-bca7-0ee08ff3a968","Type":"ContainerStarted","Data":"0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61"} Feb 28 03:38:52 crc kubenswrapper[4819]: I0228 03:38:52.540294 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8hnqz" podStartSLOduration=4.007956939 podStartE2EDuration="48.540274838s" podCreationTimestamp="2026-02-28 03:38:04 +0000 UTC" firstStartedPulling="2026-02-28 03:38:06.810197047 +0000 UTC m=+225.275765905" lastFinishedPulling="2026-02-28 03:38:51.342514946 +0000 UTC m=+269.808083804" observedRunningTime="2026-02-28 03:38:52.535412587 +0000 UTC m=+271.000981445" watchObservedRunningTime="2026-02-28 03:38:52.540274838 +0000 UTC m=+271.005843696" Feb 28 03:38:55 crc kubenswrapper[4819]: I0228 03:38:55.312127 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8hnqz" Feb 28 03:38:55 crc kubenswrapper[4819]: I0228 03:38:55.312193 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8hnqz" Feb 28 03:38:55 crc kubenswrapper[4819]: I0228 03:38:55.465932 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-8hnqz" Feb 28 03:38:55 crc kubenswrapper[4819]: I0228 03:38:55.708624 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:55 crc kubenswrapper[4819]: I0228 03:38:55.708665 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:55 crc kubenswrapper[4819]: I0228 03:38:55.752691 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:56 crc kubenswrapper[4819]: I0228 03:38:56.994562 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:38:57 crc kubenswrapper[4819]: I0228 03:38:57.444695 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g84jc"] Feb 28 03:38:58 crc kubenswrapper[4819]: I0228 03:38:58.939117 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnn2f" event={"ID":"69f8a9ba-33dd-4c1b-8383-6a340bb82292","Type":"ContainerStarted","Data":"5e0ae21cb78c6699c73a1b06d424ac1aea2d042875c79ee6322d00cebb19e8ff"} Feb 28 03:38:58 crc kubenswrapper[4819]: I0228 03:38:58.940898 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggnf2" event={"ID":"2b7e1686-948f-45a4-b739-5e88a4489b8e","Type":"ContainerStarted","Data":"3b975070c2552b8ecfa33baecf0fef72d316ee9f3451477555cc98b0d531b709"} Feb 28 03:38:58 crc kubenswrapper[4819]: I0228 03:38:58.943308 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g84jc" podUID="4421e7f5-138c-48da-9252-80593db19b91" containerName="registry-server" 
containerID="cri-o://b418e3e4f160f2d02be1019d365d8e8f1a3bfd82002ecbac8181e853c72acf83" gracePeriod=2 Feb 28 03:38:58 crc kubenswrapper[4819]: I0228 03:38:58.943780 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhxpx" event={"ID":"6ef2ca7d-0b4b-4efa-aaad-af4137689efa","Type":"ContainerStarted","Data":"13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2"} Feb 28 03:38:58 crc kubenswrapper[4819]: I0228 03:38:58.956313 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rnn2f" podStartSLOduration=39.856692013 podStartE2EDuration="51.956293568s" podCreationTimestamp="2026-02-28 03:38:07 +0000 UTC" firstStartedPulling="2026-02-28 03:38:45.237904716 +0000 UTC m=+263.703473574" lastFinishedPulling="2026-02-28 03:38:57.337506241 +0000 UTC m=+275.803075129" observedRunningTime="2026-02-28 03:38:58.95396901 +0000 UTC m=+277.419537888" watchObservedRunningTime="2026-02-28 03:38:58.956293568 +0000 UTC m=+277.421862446" Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:38:59.950868 4819 generic.go:334] "Generic (PLEG): container finished" podID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" containerID="13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2" exitCode=0 Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:38:59.950950 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhxpx" event={"ID":"6ef2ca7d-0b4b-4efa-aaad-af4137689efa","Type":"ContainerDied","Data":"13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2"} Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:38:59.952572 4819 generic.go:334] "Generic (PLEG): container finished" podID="2b7e1686-948f-45a4-b739-5e88a4489b8e" containerID="3b975070c2552b8ecfa33baecf0fef72d316ee9f3451477555cc98b0d531b709" exitCode=0 Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:38:59.952638 4819 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-ggnf2" event={"ID":"2b7e1686-948f-45a4-b739-5e88a4489b8e","Type":"ContainerDied","Data":"3b975070c2552b8ecfa33baecf0fef72d316ee9f3451477555cc98b0d531b709"} Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:38:59.958273 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxgv4" event={"ID":"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8","Type":"ContainerStarted","Data":"f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7"} Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:38:59.960600 4819 generic.go:334] "Generic (PLEG): container finished" podID="4421e7f5-138c-48da-9252-80593db19b91" containerID="b418e3e4f160f2d02be1019d365d8e8f1a3bfd82002ecbac8181e853c72acf83" exitCode=0 Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:38:59.960643 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g84jc" event={"ID":"4421e7f5-138c-48da-9252-80593db19b91","Type":"ContainerDied","Data":"b418e3e4f160f2d02be1019d365d8e8f1a3bfd82002ecbac8181e853c72acf83"} Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:38:59.995256 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fxgv4" podStartSLOduration=40.522035534 podStartE2EDuration="53.995222068s" podCreationTimestamp="2026-02-28 03:38:06 +0000 UTC" firstStartedPulling="2026-02-28 03:38:45.271809102 +0000 UTC m=+263.737377960" lastFinishedPulling="2026-02-28 03:38:58.744995636 +0000 UTC m=+277.210564494" observedRunningTime="2026-02-28 03:38:59.992582622 +0000 UTC m=+278.458151480" watchObservedRunningTime="2026-02-28 03:38:59.995222068 +0000 UTC m=+278.460790926" Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.834315 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.834684 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.879469 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.971572 4819 generic.go:334] "Generic (PLEG): container finished" podID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" containerID="4c5df239785118990f9b37ca1db9e176ca23348525d11c2194af41f169e43da1" exitCode=0 Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.971666 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkx4p" event={"ID":"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a","Type":"ContainerDied","Data":"4c5df239785118990f9b37ca1db9e176ca23348525d11c2194af41f169e43da1"} Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.974950 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g84jc" Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.974996 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g84jc" event={"ID":"4421e7f5-138c-48da-9252-80593db19b91","Type":"ContainerDied","Data":"946016bbbc55aa792f390c94887ca884a3843b31823b607e59a738d8be1baf80"} Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.975032 4819 scope.go:117] "RemoveContainer" containerID="b418e3e4f160f2d02be1019d365d8e8f1a3bfd82002ecbac8181e853c72acf83" Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.975943 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnc8w\" (UniqueName: \"kubernetes.io/projected/4421e7f5-138c-48da-9252-80593db19b91-kube-api-access-pnc8w\") pod \"4421e7f5-138c-48da-9252-80593db19b91\" (UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.976055 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-utilities\") pod \"4421e7f5-138c-48da-9252-80593db19b91\" (UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.976104 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-catalog-content\") pod \"4421e7f5-138c-48da-9252-80593db19b91\" (UID: \"4421e7f5-138c-48da-9252-80593db19b91\") " Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.976717 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-utilities" (OuterVolumeSpecName: "utilities") pod "4421e7f5-138c-48da-9252-80593db19b91" (UID: "4421e7f5-138c-48da-9252-80593db19b91"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:39:00 crc kubenswrapper[4819]: I0228 03:39:00.984040 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4421e7f5-138c-48da-9252-80593db19b91-kube-api-access-pnc8w" (OuterVolumeSpecName: "kube-api-access-pnc8w") pod "4421e7f5-138c-48da-9252-80593db19b91" (UID: "4421e7f5-138c-48da-9252-80593db19b91"). InnerVolumeSpecName "kube-api-access-pnc8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.036472 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4421e7f5-138c-48da-9252-80593db19b91" (UID: "4421e7f5-138c-48da-9252-80593db19b91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.071990 4819 scope.go:117] "RemoveContainer" containerID="c6088b36defc72ee8dc953ca118634eee723072a3a092098568d50a920fac981" Feb 28 03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.076823 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.076845 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnc8w\" (UniqueName: \"kubernetes.io/projected/4421e7f5-138c-48da-9252-80593db19b91-kube-api-access-pnc8w\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.076858 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4421e7f5-138c-48da-9252-80593db19b91-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 
03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.092987 4819 scope.go:117] "RemoveContainer" containerID="7086958c75ac1b143f72111390cf344485bd7fdb0a0746fffb54a72894dc2220" Feb 28 03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.305719 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g84jc"] Feb 28 03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.308872 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g84jc"] Feb 28 03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.981659 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mg46" event={"ID":"bb2d247d-3cce-4daa-9a51-20769f987756","Type":"ContainerStarted","Data":"0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3"} Feb 28 03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.985063 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggnf2" event={"ID":"2b7e1686-948f-45a4-b739-5e88a4489b8e","Type":"ContainerStarted","Data":"77974f021a1f64c35765c2a445112dc19f3a3487f10e5660445d1d24e7ac4f18"} Feb 28 03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.987363 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhxpx" event={"ID":"6ef2ca7d-0b4b-4efa-aaad-af4137689efa","Type":"ContainerStarted","Data":"1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b"} Feb 28 03:39:01 crc kubenswrapper[4819]: I0228 03:39:01.989806 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkx4p" event={"ID":"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a","Type":"ContainerStarted","Data":"3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706"} Feb 28 03:39:02 crc kubenswrapper[4819]: I0228 03:39:02.023383 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bkx4p" 
podStartSLOduration=2.24037216 podStartE2EDuration="57.023351056s" podCreationTimestamp="2026-02-28 03:38:05 +0000 UTC" firstStartedPulling="2026-02-28 03:38:06.802325221 +0000 UTC m=+225.267894079" lastFinishedPulling="2026-02-28 03:39:01.585304117 +0000 UTC m=+280.050872975" observedRunningTime="2026-02-28 03:39:02.021003077 +0000 UTC m=+280.486571925" watchObservedRunningTime="2026-02-28 03:39:02.023351056 +0000 UTC m=+280.488919914" Feb 28 03:39:02 crc kubenswrapper[4819]: I0228 03:39:02.034600 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ggnf2" podStartSLOduration=38.007091066 podStartE2EDuration="54.034574026s" podCreationTimestamp="2026-02-28 03:38:08 +0000 UTC" firstStartedPulling="2026-02-28 03:38:45.359154671 +0000 UTC m=+263.824723529" lastFinishedPulling="2026-02-28 03:39:01.386637631 +0000 UTC m=+279.852206489" observedRunningTime="2026-02-28 03:39:02.033956471 +0000 UTC m=+280.499525329" watchObservedRunningTime="2026-02-28 03:39:02.034574026 +0000 UTC m=+280.500142884" Feb 28 03:39:02 crc kubenswrapper[4819]: I0228 03:39:02.050298 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zhxpx" podStartSLOduration=37.789141498 podStartE2EDuration="54.050235087s" podCreationTimestamp="2026-02-28 03:38:08 +0000 UTC" firstStartedPulling="2026-02-28 03:38:45.263497814 +0000 UTC m=+263.729066672" lastFinishedPulling="2026-02-28 03:39:01.524591403 +0000 UTC m=+279.990160261" observedRunningTime="2026-02-28 03:39:02.047913169 +0000 UTC m=+280.513482027" watchObservedRunningTime="2026-02-28 03:39:02.050235087 +0000 UTC m=+280.515803945" Feb 28 03:39:02 crc kubenswrapper[4819]: I0228 03:39:02.376475 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4421e7f5-138c-48da-9252-80593db19b91" path="/var/lib/kubelet/pods/4421e7f5-138c-48da-9252-80593db19b91/volumes" Feb 28 03:39:03 crc kubenswrapper[4819]: I0228 
03:39:03.002884 4819 generic.go:334] "Generic (PLEG): container finished" podID="bb2d247d-3cce-4daa-9a51-20769f987756" containerID="0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3" exitCode=0 Feb 28 03:39:03 crc kubenswrapper[4819]: I0228 03:39:03.002951 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mg46" event={"ID":"bb2d247d-3cce-4daa-9a51-20769f987756","Type":"ContainerDied","Data":"0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3"} Feb 28 03:39:03 crc kubenswrapper[4819]: I0228 03:39:03.081392 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"] Feb 28 03:39:03 crc kubenswrapper[4819]: I0228 03:39:03.081594 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" podUID="ae37104f-2a64-46f3-93e7-37b080445e85" containerName="controller-manager" containerID="cri-o://b414e136fb8f979eb8fa7f3a735500db2a453662af49e2fbc24f78b5494c0cc7" gracePeriod=30 Feb 28 03:39:03 crc kubenswrapper[4819]: I0228 03:39:03.118071 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"] Feb 28 03:39:03 crc kubenswrapper[4819]: I0228 03:39:03.118402 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9" podUID="c8e89cb8-dad5-4343-9e17-cb518a1c77b9" containerName="route-controller-manager" containerID="cri-o://f94abce806a1b4956016af7deac6b85f345a4efb773734a9a6f62e7d2006027b" gracePeriod=30 Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.010068 4819 generic.go:334] "Generic (PLEG): container finished" podID="c8e89cb8-dad5-4343-9e17-cb518a1c77b9" containerID="f94abce806a1b4956016af7deac6b85f345a4efb773734a9a6f62e7d2006027b" exitCode=0 Feb 28 03:39:04 crc 
kubenswrapper[4819]: I0228 03:39:04.010262 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9" event={"ID":"c8e89cb8-dad5-4343-9e17-cb518a1c77b9","Type":"ContainerDied","Data":"f94abce806a1b4956016af7deac6b85f345a4efb773734a9a6f62e7d2006027b"} Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.012949 4819 generic.go:334] "Generic (PLEG): container finished" podID="ae37104f-2a64-46f3-93e7-37b080445e85" containerID="b414e136fb8f979eb8fa7f3a735500db2a453662af49e2fbc24f78b5494c0cc7" exitCode=0 Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.013010 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" event={"ID":"ae37104f-2a64-46f3-93e7-37b080445e85","Type":"ContainerDied","Data":"b414e136fb8f979eb8fa7f3a735500db2a453662af49e2fbc24f78b5494c0cc7"} Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.189446 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.210924 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f"] Feb 28 03:39:04 crc kubenswrapper[4819]: E0228 03:39:04.211107 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500425cb-63aa-43e1-bc7b-2c11c88826c5" containerName="oc" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211122 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="500425cb-63aa-43e1-bc7b-2c11c88826c5" containerName="oc" Feb 28 03:39:04 crc kubenswrapper[4819]: E0228 03:39:04.211134 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4421e7f5-138c-48da-9252-80593db19b91" containerName="registry-server" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211141 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="4421e7f5-138c-48da-9252-80593db19b91" containerName="registry-server" Feb 28 03:39:04 crc kubenswrapper[4819]: E0228 03:39:04.211153 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4421e7f5-138c-48da-9252-80593db19b91" containerName="extract-utilities" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211159 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="4421e7f5-138c-48da-9252-80593db19b91" containerName="extract-utilities" Feb 28 03:39:04 crc kubenswrapper[4819]: E0228 03:39:04.211166 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e89cb8-dad5-4343-9e17-cb518a1c77b9" containerName="route-controller-manager" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211173 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e89cb8-dad5-4343-9e17-cb518a1c77b9" containerName="route-controller-manager" Feb 28 03:39:04 crc kubenswrapper[4819]: E0228 03:39:04.211182 4819 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="004f5681-552d-4d79-abeb-10f48d02f407" containerName="pruner" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211187 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="004f5681-552d-4d79-abeb-10f48d02f407" containerName="pruner" Feb 28 03:39:04 crc kubenswrapper[4819]: E0228 03:39:04.211203 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4421e7f5-138c-48da-9252-80593db19b91" containerName="extract-content" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211209 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="4421e7f5-138c-48da-9252-80593db19b91" containerName="extract-content" Feb 28 03:39:04 crc kubenswrapper[4819]: E0228 03:39:04.211216 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd" containerName="pruner" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211221 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd" containerName="pruner" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211325 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="500425cb-63aa-43e1-bc7b-2c11c88826c5" containerName="oc" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211338 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3891fbd-1ffc-4ab9-8d00-0871e7cabfdd" containerName="pruner" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211347 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="4421e7f5-138c-48da-9252-80593db19b91" containerName="registry-server" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211357 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="004f5681-552d-4d79-abeb-10f48d02f407" containerName="pruner" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211364 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8e89cb8-dad5-4343-9e17-cb518a1c77b9" 
containerName="route-controller-manager" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.211687 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.234027 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f"] Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.315666 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-serving-cert\") pod \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.315728 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-client-ca\") pod \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.315767 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-config\") pod \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.315833 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k766n\" (UniqueName: \"kubernetes.io/projected/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-kube-api-access-k766n\") pod \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\" (UID: \"c8e89cb8-dad5-4343-9e17-cb518a1c77b9\") " Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.315986 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-serving-cert\") pod \"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.316017 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-config\") pod \"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.316044 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5428s\" (UniqueName: \"kubernetes.io/projected/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-kube-api-access-5428s\") pod \"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.316080 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-client-ca\") pod \"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.316616 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"c8e89cb8-dad5-4343-9e17-cb518a1c77b9" (UID: "c8e89cb8-dad5-4343-9e17-cb518a1c77b9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.317007 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-config" (OuterVolumeSpecName: "config") pod "c8e89cb8-dad5-4343-9e17-cb518a1c77b9" (UID: "c8e89cb8-dad5-4343-9e17-cb518a1c77b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.330824 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c8e89cb8-dad5-4343-9e17-cb518a1c77b9" (UID: "c8e89cb8-dad5-4343-9e17-cb518a1c77b9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.332620 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-kube-api-access-k766n" (OuterVolumeSpecName: "kube-api-access-k766n") pod "c8e89cb8-dad5-4343-9e17-cb518a1c77b9" (UID: "c8e89cb8-dad5-4343-9e17-cb518a1c77b9"). InnerVolumeSpecName "kube-api-access-k766n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.354151 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.416788 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5428s\" (UniqueName: \"kubernetes.io/projected/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-kube-api-access-5428s\") pod \"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.416851 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-client-ca\") pod \"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.416900 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-serving-cert\") pod \"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.416929 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-config\") pod \"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.416964 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.416977 4819 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.416985 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.416996 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k766n\" (UniqueName: \"kubernetes.io/projected/c8e89cb8-dad5-4343-9e17-cb518a1c77b9-kube-api-access-k766n\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.418068 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-client-ca\") pod \"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.418140 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-config\") pod \"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.420506 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-serving-cert\") pod 
\"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.437854 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5428s\" (UniqueName: \"kubernetes.io/projected/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-kube-api-access-5428s\") pod \"route-controller-manager-6fd55b8b89-qfc4f\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.518455 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-config\") pod \"ae37104f-2a64-46f3-93e7-37b080445e85\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.518696 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-client-ca\") pod \"ae37104f-2a64-46f3-93e7-37b080445e85\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.518764 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l94x\" (UniqueName: \"kubernetes.io/projected/ae37104f-2a64-46f3-93e7-37b080445e85-kube-api-access-9l94x\") pod \"ae37104f-2a64-46f3-93e7-37b080445e85\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.518827 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-proxy-ca-bundles\") pod \"ae37104f-2a64-46f3-93e7-37b080445e85\" (UID: 
\"ae37104f-2a64-46f3-93e7-37b080445e85\") " Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.518864 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae37104f-2a64-46f3-93e7-37b080445e85-serving-cert\") pod \"ae37104f-2a64-46f3-93e7-37b080445e85\" (UID: \"ae37104f-2a64-46f3-93e7-37b080445e85\") " Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.519605 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-client-ca" (OuterVolumeSpecName: "client-ca") pod "ae37104f-2a64-46f3-93e7-37b080445e85" (UID: "ae37104f-2a64-46f3-93e7-37b080445e85"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.519624 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ae37104f-2a64-46f3-93e7-37b080445e85" (UID: "ae37104f-2a64-46f3-93e7-37b080445e85"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.519660 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-config" (OuterVolumeSpecName: "config") pod "ae37104f-2a64-46f3-93e7-37b080445e85" (UID: "ae37104f-2a64-46f3-93e7-37b080445e85"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.521632 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae37104f-2a64-46f3-93e7-37b080445e85-kube-api-access-9l94x" (OuterVolumeSpecName: "kube-api-access-9l94x") pod "ae37104f-2a64-46f3-93e7-37b080445e85" (UID: "ae37104f-2a64-46f3-93e7-37b080445e85"). InnerVolumeSpecName "kube-api-access-9l94x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.523054 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae37104f-2a64-46f3-93e7-37b080445e85-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ae37104f-2a64-46f3-93e7-37b080445e85" (UID: "ae37104f-2a64-46f3-93e7-37b080445e85"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.535074 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.620884 4819 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.620918 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l94x\" (UniqueName: \"kubernetes.io/projected/ae37104f-2a64-46f3-93e7-37b080445e85-kube-api-access-9l94x\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.620932 4819 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.620941 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae37104f-2a64-46f3-93e7-37b080445e85-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:04 crc kubenswrapper[4819]: I0228 03:39:04.620951 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae37104f-2a64-46f3-93e7-37b080445e85-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.022891 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9" event={"ID":"c8e89cb8-dad5-4343-9e17-cb518a1c77b9","Type":"ContainerDied","Data":"501e882fabd12f007914ae6a976910471ed0f38290d570cbf4f6dc082ed4904b"} Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.022953 4819 scope.go:117] "RemoveContainer" containerID="f94abce806a1b4956016af7deac6b85f345a4efb773734a9a6f62e7d2006027b" Feb 28 03:39:05 crc 
kubenswrapper[4819]: I0228 03:39:05.023106 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9" Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.029215 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.029683 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dfdb5556c-xm7bx" event={"ID":"ae37104f-2a64-46f3-93e7-37b080445e85","Type":"ContainerDied","Data":"81c40b478b91053cf7a1f43bcaefd9ad2aa1fa78e6ac19b72e1ce973231dc39f"} Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.037331 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mg46" event={"ID":"bb2d247d-3cce-4daa-9a51-20769f987756","Type":"ContainerStarted","Data":"8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74"} Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.043506 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"] Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.046017 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f"] Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.048273 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66bb8d59cd-nz7r9"] Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.051612 4819 scope.go:117] "RemoveContainer" containerID="b414e136fb8f979eb8fa7f3a735500db2a453662af49e2fbc24f78b5494c0cc7" Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.068467 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"] Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.071738 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-dfdb5556c-xm7bx"] Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.364759 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8hnqz" Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.515103 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bkx4p" Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.515156 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bkx4p" Feb 28 03:39:05 crc kubenswrapper[4819]: I0228 03:39:05.545948 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bkx4p" Feb 28 03:39:06 crc kubenswrapper[4819]: I0228 03:39:06.046546 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" event={"ID":"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e","Type":"ContainerStarted","Data":"bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0"} Feb 28 03:39:06 crc kubenswrapper[4819]: I0228 03:39:06.047334 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" event={"ID":"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e","Type":"ContainerStarted","Data":"864786dfcf2489ad304c73ab0167aca87395f038d7129384c7ea9e8d24401358"} Feb 28 03:39:06 crc kubenswrapper[4819]: I0228 03:39:06.073032 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8mg46" podStartSLOduration=8.042855338 podStartE2EDuration="1m1.073007829s" 
podCreationTimestamp="2026-02-28 03:38:05 +0000 UTC" firstStartedPulling="2026-02-28 03:38:10.827749534 +0000 UTC m=+229.293318432" lastFinishedPulling="2026-02-28 03:39:03.857902065 +0000 UTC m=+282.323470923" observedRunningTime="2026-02-28 03:39:06.071825539 +0000 UTC m=+284.537394417" watchObservedRunningTime="2026-02-28 03:39:06.073007829 +0000 UTC m=+284.538576687" Feb 28 03:39:06 crc kubenswrapper[4819]: I0228 03:39:06.094390 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bkx4p" Feb 28 03:39:06 crc kubenswrapper[4819]: I0228 03:39:06.374539 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae37104f-2a64-46f3-93e7-37b080445e85" path="/var/lib/kubelet/pods/ae37104f-2a64-46f3-93e7-37b080445e85/volumes" Feb 28 03:39:06 crc kubenswrapper[4819]: I0228 03:39:06.375851 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8e89cb8-dad5-4343-9e17-cb518a1c77b9" path="/var/lib/kubelet/pods/c8e89cb8-dad5-4343-9e17-cb518a1c77b9/volumes" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.052900 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.057570 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.076641 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" podStartSLOduration=4.076610697 podStartE2EDuration="4.076610697s" podCreationTimestamp="2026-02-28 03:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:39:07.074560846 +0000 
UTC m=+285.540129704" watchObservedRunningTime="2026-02-28 03:39:07.076610697 +0000 UTC m=+285.542179595" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.089800 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-999696588-zvjvm"] Feb 28 03:39:07 crc kubenswrapper[4819]: E0228 03:39:07.090161 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae37104f-2a64-46f3-93e7-37b080445e85" containerName="controller-manager" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.090184 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae37104f-2a64-46f3-93e7-37b080445e85" containerName="controller-manager" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.090324 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae37104f-2a64-46f3-93e7-37b080445e85" containerName="controller-manager" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.090873 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.093622 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.093665 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.093725 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.093934 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.093980 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.094395 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.100484 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.111894 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-999696588-zvjvm"] Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.256845 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-client-ca\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " 
pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.256910 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-config\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.256940 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-proxy-ca-bundles\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.257002 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwdcf\" (UniqueName: \"kubernetes.io/projected/a406ca7e-042c-425d-a88b-150b5f48fb3c-kube-api-access-wwdcf\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.257055 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a406ca7e-042c-425d-a88b-150b5f48fb3c-serving-cert\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.358558 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a406ca7e-042c-425d-a88b-150b5f48fb3c-serving-cert\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.358624 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-client-ca\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.358655 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-config\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.358688 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-proxy-ca-bundles\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.358749 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwdcf\" (UniqueName: \"kubernetes.io/projected/a406ca7e-042c-425d-a88b-150b5f48fb3c-kube-api-access-wwdcf\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.360809 4819 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-client-ca\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.363104 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-config\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.363815 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-proxy-ca-bundles\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.366600 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a406ca7e-042c-425d-a88b-150b5f48fb3c-serving-cert\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.391826 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwdcf\" (UniqueName: \"kubernetes.io/projected/a406ca7e-042c-425d-a88b-150b5f48fb3c-kube-api-access-wwdcf\") pod \"controller-manager-999696588-zvjvm\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc 
kubenswrapper[4819]: I0228 03:39:07.409880 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:07 crc kubenswrapper[4819]: I0228 03:39:07.668083 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-999696588-zvjvm"] Feb 28 03:39:08 crc kubenswrapper[4819]: I0228 03:39:08.059527 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" event={"ID":"a406ca7e-042c-425d-a88b-150b5f48fb3c","Type":"ContainerStarted","Data":"10ba1209cf0438d24659299f526c23bc572aab94261f82581feaa84b771ccc92"} Feb 28 03:39:08 crc kubenswrapper[4819]: I0228 03:39:08.328425 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:39:08 crc kubenswrapper[4819]: I0228 03:39:08.335478 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:39:08 crc kubenswrapper[4819]: I0228 03:39:08.376910 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:39:08 crc kubenswrapper[4819]: I0228 03:39:08.386776 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:39:08 crc kubenswrapper[4819]: I0228 03:39:08.387871 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:39:08 crc kubenswrapper[4819]: I0228 03:39:08.440683 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:39:08 crc kubenswrapper[4819]: I0228 03:39:08.500798 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" Feb 28 03:39:08 crc kubenswrapper[4819]: I0228 03:39:08.747378 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zhxpx" Feb 28 03:39:08 crc kubenswrapper[4819]: I0228 03:39:08.747775 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zhxpx" Feb 28 03:39:08 crc kubenswrapper[4819]: I0228 03:39:08.786422 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zhxpx" Feb 28 03:39:09 crc kubenswrapper[4819]: I0228 03:39:09.065519 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" event={"ID":"a406ca7e-042c-425d-a88b-150b5f48fb3c","Type":"ContainerStarted","Data":"470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16"} Feb 28 03:39:09 crc kubenswrapper[4819]: I0228 03:39:09.084368 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" podStartSLOduration=6.084342378 podStartE2EDuration="6.084342378s" podCreationTimestamp="2026-02-28 03:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:39:09.083469526 +0000 UTC m=+287.549038384" watchObservedRunningTime="2026-02-28 03:39:09.084342378 +0000 UTC m=+287.549911256" Feb 28 03:39:09 crc kubenswrapper[4819]: I0228 03:39:09.103926 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zhxpx" Feb 28 03:39:09 crc kubenswrapper[4819]: I0228 03:39:09.107080 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ggnf2" Feb 28 03:39:09 crc kubenswrapper[4819]: I0228 03:39:09.107122 
4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ggnf2" Feb 28 03:39:09 crc kubenswrapper[4819]: I0228 03:39:09.108794 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:39:09 crc kubenswrapper[4819]: I0228 03:39:09.113231 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:39:09 crc kubenswrapper[4819]: I0228 03:39:09.187695 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ggnf2" Feb 28 03:39:10 crc kubenswrapper[4819]: I0228 03:39:10.074940 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:10 crc kubenswrapper[4819]: I0228 03:39:10.079706 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:10 crc kubenswrapper[4819]: I0228 03:39:10.150066 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ggnf2" Feb 28 03:39:11 crc kubenswrapper[4819]: I0228 03:39:11.048093 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnn2f"] Feb 28 03:39:11 crc kubenswrapper[4819]: I0228 03:39:11.245466 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ggnf2"] Feb 28 03:39:12 crc kubenswrapper[4819]: I0228 03:39:12.094738 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rnn2f" podUID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" containerName="registry-server" containerID="cri-o://5e0ae21cb78c6699c73a1b06d424ac1aea2d042875c79ee6322d00cebb19e8ff" gracePeriod=2 
Feb 28 03:39:12 crc kubenswrapper[4819]: I0228 03:39:12.094985 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ggnf2" podUID="2b7e1686-948f-45a4-b739-5e88a4489b8e" containerName="registry-server" containerID="cri-o://77974f021a1f64c35765c2a445112dc19f3a3487f10e5660445d1d24e7ac4f18" gracePeriod=2 Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.104202 4819 generic.go:334] "Generic (PLEG): container finished" podID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" containerID="5e0ae21cb78c6699c73a1b06d424ac1aea2d042875c79ee6322d00cebb19e8ff" exitCode=0 Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.104311 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnn2f" event={"ID":"69f8a9ba-33dd-4c1b-8383-6a340bb82292","Type":"ContainerDied","Data":"5e0ae21cb78c6699c73a1b06d424ac1aea2d042875c79ee6322d00cebb19e8ff"} Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.107136 4819 generic.go:334] "Generic (PLEG): container finished" podID="2b7e1686-948f-45a4-b739-5e88a4489b8e" containerID="77974f021a1f64c35765c2a445112dc19f3a3487f10e5660445d1d24e7ac4f18" exitCode=0 Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.107178 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggnf2" event={"ID":"2b7e1686-948f-45a4-b739-5e88a4489b8e","Type":"ContainerDied","Data":"77974f021a1f64c35765c2a445112dc19f3a3487f10e5660445d1d24e7ac4f18"} Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.620979 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.688661 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ggnf2" Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.741486 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4sll\" (UniqueName: \"kubernetes.io/projected/69f8a9ba-33dd-4c1b-8383-6a340bb82292-kube-api-access-r4sll\") pod \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\" (UID: \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.741550 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-catalog-content\") pod \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\" (UID: \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.741598 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-utilities\") pod \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\" (UID: \"69f8a9ba-33dd-4c1b-8383-6a340bb82292\") " Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.742947 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-utilities" (OuterVolumeSpecName: "utilities") pod "69f8a9ba-33dd-4c1b-8383-6a340bb82292" (UID: "69f8a9ba-33dd-4c1b-8383-6a340bb82292"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.746835 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f8a9ba-33dd-4c1b-8383-6a340bb82292-kube-api-access-r4sll" (OuterVolumeSpecName: "kube-api-access-r4sll") pod "69f8a9ba-33dd-4c1b-8383-6a340bb82292" (UID: "69f8a9ba-33dd-4c1b-8383-6a340bb82292"). InnerVolumeSpecName "kube-api-access-r4sll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.773841 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69f8a9ba-33dd-4c1b-8383-6a340bb82292" (UID: "69f8a9ba-33dd-4c1b-8383-6a340bb82292"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.842973 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf668\" (UniqueName: \"kubernetes.io/projected/2b7e1686-948f-45a4-b739-5e88a4489b8e-kube-api-access-mf668\") pod \"2b7e1686-948f-45a4-b739-5e88a4489b8e\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.843017 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-utilities\") pod \"2b7e1686-948f-45a4-b739-5e88a4489b8e\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.843076 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-catalog-content\") pod \"2b7e1686-948f-45a4-b739-5e88a4489b8e\" (UID: \"2b7e1686-948f-45a4-b739-5e88a4489b8e\") " Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.843298 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4sll\" (UniqueName: \"kubernetes.io/projected/69f8a9ba-33dd-4c1b-8383-6a340bb82292-kube-api-access-r4sll\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.843310 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.843321 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69f8a9ba-33dd-4c1b-8383-6a340bb82292-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.843780 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-utilities" (OuterVolumeSpecName: "utilities") pod "2b7e1686-948f-45a4-b739-5e88a4489b8e" (UID: "2b7e1686-948f-45a4-b739-5e88a4489b8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.846733 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7e1686-948f-45a4-b739-5e88a4489b8e-kube-api-access-mf668" (OuterVolumeSpecName: "kube-api-access-mf668") pod "2b7e1686-948f-45a4-b739-5e88a4489b8e" (UID: "2b7e1686-948f-45a4-b739-5e88a4489b8e"). InnerVolumeSpecName "kube-api-access-mf668". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.944765 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf668\" (UniqueName: \"kubernetes.io/projected/2b7e1686-948f-45a4-b739-5e88a4489b8e-kube-api-access-mf668\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:13 crc kubenswrapper[4819]: I0228 03:39:13.945040 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.116223 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggnf2" event={"ID":"2b7e1686-948f-45a4-b739-5e88a4489b8e","Type":"ContainerDied","Data":"2277b19318a132c806884b99604e2560c8bcc948eeaa24faeeb676850b0ec575"} Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.116336 4819 scope.go:117] "RemoveContainer" containerID="77974f021a1f64c35765c2a445112dc19f3a3487f10e5660445d1d24e7ac4f18" Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.116495 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggnf2" Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.121416 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnn2f" event={"ID":"69f8a9ba-33dd-4c1b-8383-6a340bb82292","Type":"ContainerDied","Data":"dd61951b140ab3d8df38c00f66230ce8f59b2625b31437df379c22b7a850cc13"} Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.121515 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnn2f" Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.141167 4819 scope.go:117] "RemoveContainer" containerID="3b975070c2552b8ecfa33baecf0fef72d316ee9f3451477555cc98b0d531b709" Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.193800 4819 scope.go:117] "RemoveContainer" containerID="c96372d6bbf465c6dd30897be488008eb8733c6ebbf7e237c872360d55d8e964" Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.201060 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnn2f"] Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.207396 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnn2f"] Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.228501 4819 scope.go:117] "RemoveContainer" containerID="5e0ae21cb78c6699c73a1b06d424ac1aea2d042875c79ee6322d00cebb19e8ff" Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.243204 4819 scope.go:117] "RemoveContainer" containerID="4fa04afe4493194bec58c9b195f70b2422a55c09b9172df0dfd27f19016af7cd" Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.256959 4819 scope.go:117] "RemoveContainer" containerID="2662d25b134c07ac5eba4817e09493d351c76aac7da485e2bbcc294e09e46d9e" Feb 28 03:39:14 crc kubenswrapper[4819]: I0228 03:39:14.382471 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" path="/var/lib/kubelet/pods/69f8a9ba-33dd-4c1b-8383-6a340bb82292/volumes" Feb 28 03:39:15 crc kubenswrapper[4819]: I0228 03:39:15.891865 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w5fln"] Feb 28 03:39:15 crc kubenswrapper[4819]: I0228 03:39:15.912134 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:39:15 crc kubenswrapper[4819]: I0228 03:39:15.912211 
4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:39:16 crc kubenswrapper[4819]: I0228 03:39:16.006524 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:39:16 crc kubenswrapper[4819]: I0228 03:39:16.200210 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:39:18 crc kubenswrapper[4819]: I0228 03:39:18.326656 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b7e1686-948f-45a4-b739-5e88a4489b8e" (UID: "2b7e1686-948f-45a4-b739-5e88a4489b8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:39:18 crc kubenswrapper[4819]: I0228 03:39:18.413089 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b7e1686-948f-45a4-b739-5e88a4489b8e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:18 crc kubenswrapper[4819]: I0228 03:39:18.637680 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 28 03:39:18 crc kubenswrapper[4819]: I0228 03:39:18.639785 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ggnf2"] Feb 28 03:39:18 crc kubenswrapper[4819]: I0228 03:39:18.646095 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ggnf2"] Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 03:39:19.049103 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mg46"] Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 
03:39:19.049494 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8mg46" podUID="bb2d247d-3cce-4daa-9a51-20769f987756" containerName="registry-server" containerID="cri-o://8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74" gracePeriod=2 Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 03:39:19.589165 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 03:39:19.757567 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2h5f\" (UniqueName: \"kubernetes.io/projected/bb2d247d-3cce-4daa-9a51-20769f987756-kube-api-access-b2h5f\") pod \"bb2d247d-3cce-4daa-9a51-20769f987756\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 03:39:19.757642 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-catalog-content\") pod \"bb2d247d-3cce-4daa-9a51-20769f987756\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 03:39:19.757664 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-utilities\") pod \"bb2d247d-3cce-4daa-9a51-20769f987756\" (UID: \"bb2d247d-3cce-4daa-9a51-20769f987756\") " Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 03:39:19.758818 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-utilities" (OuterVolumeSpecName: "utilities") pod "bb2d247d-3cce-4daa-9a51-20769f987756" (UID: "bb2d247d-3cce-4daa-9a51-20769f987756"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 03:39:19.768360 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2d247d-3cce-4daa-9a51-20769f987756-kube-api-access-b2h5f" (OuterVolumeSpecName: "kube-api-access-b2h5f") pod "bb2d247d-3cce-4daa-9a51-20769f987756" (UID: "bb2d247d-3cce-4daa-9a51-20769f987756"). InnerVolumeSpecName "kube-api-access-b2h5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 03:39:19.812956 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb2d247d-3cce-4daa-9a51-20769f987756" (UID: "bb2d247d-3cce-4daa-9a51-20769f987756"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 03:39:19.858850 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2h5f\" (UniqueName: \"kubernetes.io/projected/bb2d247d-3cce-4daa-9a51-20769f987756-kube-api-access-b2h5f\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 03:39:19.858893 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:19 crc kubenswrapper[4819]: I0228 03:39:19.858906 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb2d247d-3cce-4daa-9a51-20769f987756-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.173145 4819 generic.go:334] "Generic (PLEG): container finished" podID="bb2d247d-3cce-4daa-9a51-20769f987756" 
containerID="8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74" exitCode=0 Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.173198 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mg46" event={"ID":"bb2d247d-3cce-4daa-9a51-20769f987756","Type":"ContainerDied","Data":"8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74"} Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.173221 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mg46" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.173245 4819 scope.go:117] "RemoveContainer" containerID="8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.173229 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mg46" event={"ID":"bb2d247d-3cce-4daa-9a51-20769f987756","Type":"ContainerDied","Data":"7c9994d76d11029219146aae5c02a04a39ee45105bb4ec5d89b8b89937c33871"} Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.188815 4819 scope.go:117] "RemoveContainer" containerID="0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.200927 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mg46"] Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.213591 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8mg46"] Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.218588 4819 scope.go:117] "RemoveContainer" containerID="e8b2633bf0e9bda7b950fddff07ae525840cafdc4ed7e4eb40a815e32bf8311b" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.238704 4819 scope.go:117] "RemoveContainer" containerID="8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74" Feb 28 
03:39:20 crc kubenswrapper[4819]: E0228 03:39:20.239088 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74\": container with ID starting with 8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74 not found: ID does not exist" containerID="8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.239124 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74"} err="failed to get container status \"8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74\": rpc error: code = NotFound desc = could not find container \"8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74\": container with ID starting with 8e1019ed963cd52946c8f505da441517e58335d05064b89891c3c949baf1ed74 not found: ID does not exist" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.239149 4819 scope.go:117] "RemoveContainer" containerID="0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3" Feb 28 03:39:20 crc kubenswrapper[4819]: E0228 03:39:20.239400 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3\": container with ID starting with 0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3 not found: ID does not exist" containerID="0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.239421 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3"} err="failed to get container status 
\"0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3\": rpc error: code = NotFound desc = could not find container \"0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3\": container with ID starting with 0f049a2b060b01451b8a609604d7aa4ece1453bd2f6efec65f9bc6133e2a40c3 not found: ID does not exist" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.239437 4819 scope.go:117] "RemoveContainer" containerID="e8b2633bf0e9bda7b950fddff07ae525840cafdc4ed7e4eb40a815e32bf8311b" Feb 28 03:39:20 crc kubenswrapper[4819]: E0228 03:39:20.239685 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8b2633bf0e9bda7b950fddff07ae525840cafdc4ed7e4eb40a815e32bf8311b\": container with ID starting with e8b2633bf0e9bda7b950fddff07ae525840cafdc4ed7e4eb40a815e32bf8311b not found: ID does not exist" containerID="e8b2633bf0e9bda7b950fddff07ae525840cafdc4ed7e4eb40a815e32bf8311b" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.239709 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8b2633bf0e9bda7b950fddff07ae525840cafdc4ed7e4eb40a815e32bf8311b"} err="failed to get container status \"e8b2633bf0e9bda7b950fddff07ae525840cafdc4ed7e4eb40a815e32bf8311b\": rpc error: code = NotFound desc = could not find container \"e8b2633bf0e9bda7b950fddff07ae525840cafdc4ed7e4eb40a815e32bf8311b\": container with ID starting with e8b2633bf0e9bda7b950fddff07ae525840cafdc4ed7e4eb40a815e32bf8311b not found: ID does not exist" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.381198 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7e1686-948f-45a4-b739-5e88a4489b8e" path="/var/lib/kubelet/pods/2b7e1686-948f-45a4-b739-5e88a4489b8e/volumes" Feb 28 03:39:20 crc kubenswrapper[4819]: I0228 03:39:20.382595 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2d247d-3cce-4daa-9a51-20769f987756" 
path="/var/lib/kubelet/pods/bb2d247d-3cce-4daa-9a51-20769f987756/volumes" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.104148 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-999696588-zvjvm"] Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.104596 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" podUID="a406ca7e-042c-425d-a88b-150b5f48fb3c" containerName="controller-manager" containerID="cri-o://470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16" gracePeriod=30 Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.218491 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f"] Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.218939 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" podUID="ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e" containerName="route-controller-manager" containerID="cri-o://bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0" gracePeriod=30 Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.709666 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.759465 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.820420 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-client-ca\") pod \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.820465 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-serving-cert\") pod \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.820513 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5428s\" (UniqueName: \"kubernetes.io/projected/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-kube-api-access-5428s\") pod \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.820565 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-config\") pod \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\" (UID: \"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e\") " Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.821451 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-config" (OuterVolumeSpecName: "config") pod "ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e" (UID: "ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.821831 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-client-ca" (OuterVolumeSpecName: "client-ca") pod "ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e" (UID: "ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.826213 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-kube-api-access-5428s" (OuterVolumeSpecName: "kube-api-access-5428s") pod "ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e" (UID: "ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e"). InnerVolumeSpecName "kube-api-access-5428s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.826277 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e" (UID: "ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.922154 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-config\") pod \"a406ca7e-042c-425d-a88b-150b5f48fb3c\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.922221 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-proxy-ca-bundles\") pod \"a406ca7e-042c-425d-a88b-150b5f48fb3c\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.922283 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwdcf\" (UniqueName: \"kubernetes.io/projected/a406ca7e-042c-425d-a88b-150b5f48fb3c-kube-api-access-wwdcf\") pod \"a406ca7e-042c-425d-a88b-150b5f48fb3c\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.922326 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-client-ca\") pod \"a406ca7e-042c-425d-a88b-150b5f48fb3c\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.922382 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a406ca7e-042c-425d-a88b-150b5f48fb3c-serving-cert\") pod \"a406ca7e-042c-425d-a88b-150b5f48fb3c\" (UID: \"a406ca7e-042c-425d-a88b-150b5f48fb3c\") " Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.922763 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5428s\" (UniqueName: 
\"kubernetes.io/projected/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-kube-api-access-5428s\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.922795 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.922814 4819 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.922831 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.923196 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-client-ca" (OuterVolumeSpecName: "client-ca") pod "a406ca7e-042c-425d-a88b-150b5f48fb3c" (UID: "a406ca7e-042c-425d-a88b-150b5f48fb3c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.923274 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a406ca7e-042c-425d-a88b-150b5f48fb3c" (UID: "a406ca7e-042c-425d-a88b-150b5f48fb3c"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.923461 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-config" (OuterVolumeSpecName: "config") pod "a406ca7e-042c-425d-a88b-150b5f48fb3c" (UID: "a406ca7e-042c-425d-a88b-150b5f48fb3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.928146 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a406ca7e-042c-425d-a88b-150b5f48fb3c-kube-api-access-wwdcf" (OuterVolumeSpecName: "kube-api-access-wwdcf") pod "a406ca7e-042c-425d-a88b-150b5f48fb3c" (UID: "a406ca7e-042c-425d-a88b-150b5f48fb3c"). InnerVolumeSpecName "kube-api-access-wwdcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:39:23 crc kubenswrapper[4819]: I0228 03:39:23.928163 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a406ca7e-042c-425d-a88b-150b5f48fb3c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a406ca7e-042c-425d-a88b-150b5f48fb3c" (UID: "a406ca7e-042c-425d-a88b-150b5f48fb3c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.024724 4819 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.024771 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwdcf\" (UniqueName: \"kubernetes.io/projected/a406ca7e-042c-425d-a88b-150b5f48fb3c-kube-api-access-wwdcf\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.024795 4819 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.024813 4819 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a406ca7e-042c-425d-a88b-150b5f48fb3c-client-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.024831 4819 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a406ca7e-042c-425d-a88b-150b5f48fb3c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.198557 4819 generic.go:334] "Generic (PLEG): container finished" podID="ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e" containerID="bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0" exitCode=0 Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.198649 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.198680 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" event={"ID":"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e","Type":"ContainerDied","Data":"bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0"} Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.198720 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f" event={"ID":"ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e","Type":"ContainerDied","Data":"864786dfcf2489ad304c73ab0167aca87395f038d7129384c7ea9e8d24401358"} Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.198747 4819 scope.go:117] "RemoveContainer" containerID="bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.201921 4819 generic.go:334] "Generic (PLEG): container finished" podID="a406ca7e-042c-425d-a88b-150b5f48fb3c" containerID="470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16" exitCode=0 Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.201976 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" event={"ID":"a406ca7e-042c-425d-a88b-150b5f48fb3c","Type":"ContainerDied","Data":"470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16"} Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.202008 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.202014 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-999696588-zvjvm" event={"ID":"a406ca7e-042c-425d-a88b-150b5f48fb3c","Type":"ContainerDied","Data":"10ba1209cf0438d24659299f526c23bc572aab94261f82581feaa84b771ccc92"} Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.266320 4819 scope.go:117] "RemoveContainer" containerID="bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0" Feb 28 03:39:24 crc kubenswrapper[4819]: E0228 03:39:24.268781 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0\": container with ID starting with bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0 not found: ID does not exist" containerID="bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.268875 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0"} err="failed to get container status \"bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0\": rpc error: code = NotFound desc = could not find container \"bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0\": container with ID starting with bedacddf73649ec450530e9d71b98a4ef098843cb4ee176f5741c4bb4769f6a0 not found: ID does not exist" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.268916 4819 scope.go:117] "RemoveContainer" containerID="470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.281687 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-999696588-zvjvm"] Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.291693 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-999696588-zvjvm"] Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.299181 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f"] Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.304186 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fd55b8b89-qfc4f"] Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.309348 4819 scope.go:117] "RemoveContainer" containerID="470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16" Feb 28 03:39:24 crc kubenswrapper[4819]: E0228 03:39:24.310426 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16\": container with ID starting with 470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16 not found: ID does not exist" containerID="470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.310492 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16"} err="failed to get container status \"470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16\": rpc error: code = NotFound desc = could not find container \"470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16\": container with ID starting with 470b74872760d7b3710208b9721beab784dea0ca254f443f1a5c8683a2124b16 not found: ID does not exist" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.381044 4819 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="a406ca7e-042c-425d-a88b-150b5f48fb3c" path="/var/lib/kubelet/pods/a406ca7e-042c-425d-a88b-150b5f48fb3c/volumes" Feb 28 03:39:24 crc kubenswrapper[4819]: I0228 03:39:24.382148 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e" path="/var/lib/kubelet/pods/ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e/volumes" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.115695 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54bff9fc78-j6bfs"] Feb 28 03:39:25 crc kubenswrapper[4819]: E0228 03:39:25.116172 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" containerName="registry-server" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116184 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" containerName="registry-server" Feb 28 03:39:25 crc kubenswrapper[4819]: E0228 03:39:25.116202 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7e1686-948f-45a4-b739-5e88a4489b8e" containerName="extract-utilities" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116209 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7e1686-948f-45a4-b739-5e88a4489b8e" containerName="extract-utilities" Feb 28 03:39:25 crc kubenswrapper[4819]: E0228 03:39:25.116215 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2d247d-3cce-4daa-9a51-20769f987756" containerName="extract-content" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116222 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2d247d-3cce-4daa-9a51-20769f987756" containerName="extract-content" Feb 28 03:39:25 crc kubenswrapper[4819]: E0228 03:39:25.116229 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a406ca7e-042c-425d-a88b-150b5f48fb3c" containerName="controller-manager" 
Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116236 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="a406ca7e-042c-425d-a88b-150b5f48fb3c" containerName="controller-manager" Feb 28 03:39:25 crc kubenswrapper[4819]: E0228 03:39:25.116261 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" containerName="extract-utilities" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116272 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" containerName="extract-utilities" Feb 28 03:39:25 crc kubenswrapper[4819]: E0228 03:39:25.116283 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e" containerName="route-controller-manager" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116289 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e" containerName="route-controller-manager" Feb 28 03:39:25 crc kubenswrapper[4819]: E0228 03:39:25.116297 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2d247d-3cce-4daa-9a51-20769f987756" containerName="registry-server" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116303 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2d247d-3cce-4daa-9a51-20769f987756" containerName="registry-server" Feb 28 03:39:25 crc kubenswrapper[4819]: E0228 03:39:25.116312 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2d247d-3cce-4daa-9a51-20769f987756" containerName="extract-utilities" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116318 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2d247d-3cce-4daa-9a51-20769f987756" containerName="extract-utilities" Feb 28 03:39:25 crc kubenswrapper[4819]: E0228 03:39:25.116324 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7e1686-948f-45a4-b739-5e88a4489b8e" 
containerName="extract-content" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116330 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7e1686-948f-45a4-b739-5e88a4489b8e" containerName="extract-content" Feb 28 03:39:25 crc kubenswrapper[4819]: E0228 03:39:25.116337 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7e1686-948f-45a4-b739-5e88a4489b8e" containerName="registry-server" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116342 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7e1686-948f-45a4-b739-5e88a4489b8e" containerName="registry-server" Feb 28 03:39:25 crc kubenswrapper[4819]: E0228 03:39:25.116352 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" containerName="extract-content" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116359 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" containerName="extract-content" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116466 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="a406ca7e-042c-425d-a88b-150b5f48fb3c" containerName="controller-manager" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116475 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f8a9ba-33dd-4c1b-8383-6a340bb82292" containerName="registry-server" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116483 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2d247d-3cce-4daa-9a51-20769f987756" containerName="registry-server" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116492 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd07e9b-cf88-4421-bfa6-4a9a0ff5749e" containerName="route-controller-manager" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116499 4819 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2b7e1686-948f-45a4-b739-5e88a4489b8e" containerName="registry-server" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.116841 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.119425 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw"] Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.119695 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.120404 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.120458 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.120732 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.120846 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.125932 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.126015 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.133238 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.133286 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.133430 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.133625 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.134398 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.134455 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 28 03:39:25 crc kubenswrapper[4819]: 
I0228 03:39:25.139657 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw"] Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.142656 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.183167 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54bff9fc78-j6bfs"] Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.241060 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5505f7-9762-4120-b295-fd6d19803f1c-config\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.241121 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c5505f7-9762-4120-b295-fd6d19803f1c-serving-cert\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.241265 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-proxy-ca-bundles\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.241377 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-config\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.241406 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-serving-cert\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.241430 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krcwm\" (UniqueName: \"kubernetes.io/projected/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-kube-api-access-krcwm\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.241461 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c5505f7-9762-4120-b295-fd6d19803f1c-client-ca\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.241490 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-client-ca\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: 
\"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.241612 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9qn4\" (UniqueName: \"kubernetes.io/projected/9c5505f7-9762-4120-b295-fd6d19803f1c-kube-api-access-c9qn4\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.342412 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c5505f7-9762-4120-b295-fd6d19803f1c-serving-cert\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.342467 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-proxy-ca-bundles\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.342502 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-config\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.342522 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-serving-cert\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.342544 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krcwm\" (UniqueName: \"kubernetes.io/projected/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-kube-api-access-krcwm\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.342568 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c5505f7-9762-4120-b295-fd6d19803f1c-client-ca\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.342589 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-client-ca\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.342617 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qn4\" (UniqueName: \"kubernetes.io/projected/9c5505f7-9762-4120-b295-fd6d19803f1c-kube-api-access-c9qn4\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" 
Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.342666 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5505f7-9762-4120-b295-fd6d19803f1c-config\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.343753 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-proxy-ca-bundles\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.344061 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5505f7-9762-4120-b295-fd6d19803f1c-config\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.345308 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-config\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.345485 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-client-ca\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " 
pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.345929 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c5505f7-9762-4120-b295-fd6d19803f1c-client-ca\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.351645 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-serving-cert\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.352031 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c5505f7-9762-4120-b295-fd6d19803f1c-serving-cert\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.366309 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krcwm\" (UniqueName: \"kubernetes.io/projected/0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75-kube-api-access-krcwm\") pod \"controller-manager-54bff9fc78-j6bfs\" (UID: \"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75\") " pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.381846 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9qn4\" (UniqueName: 
\"kubernetes.io/projected/9c5505f7-9762-4120-b295-fd6d19803f1c-kube-api-access-c9qn4\") pod \"route-controller-manager-88b9fd9f5-wfjdw\" (UID: \"9c5505f7-9762-4120-b295-fd6d19803f1c\") " pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.444512 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.459863 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.894607 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54bff9fc78-j6bfs"] Feb 28 03:39:25 crc kubenswrapper[4819]: I0228 03:39:25.955505 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw"] Feb 28 03:39:25 crc kubenswrapper[4819]: W0228 03:39:25.962499 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c5505f7_9762_4120_b295_fd6d19803f1c.slice/crio-bf367ebbbe3cfc5478903a039ccf27e1d20e89cf3ac131a1a685454c38eda1eb WatchSource:0}: Error finding container bf367ebbbe3cfc5478903a039ccf27e1d20e89cf3ac131a1a685454c38eda1eb: Status 404 returned error can't find the container with id bf367ebbbe3cfc5478903a039ccf27e1d20e89cf3ac131a1a685454c38eda1eb Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.217588 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" event={"ID":"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75","Type":"ContainerStarted","Data":"a252bca40ea5747639c6daebe3c5339bc9340910a122ef753bf5e946dbcebcc5"} Feb 28 03:39:26 crc 
kubenswrapper[4819]: I0228 03:39:26.218072 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" event={"ID":"0cfe6ba1-95d2-4b7d-b69f-7ad5884e0b75","Type":"ContainerStarted","Data":"d4574cf0676d9ed2e1cea3018fba735e17552e931bbdfe7190a04f33379d6a1a"} Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.219097 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.220613 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" event={"ID":"9c5505f7-9762-4120-b295-fd6d19803f1c","Type":"ContainerStarted","Data":"7a3a78f04aa5cae19e09f78ef01b2778ccee9a893670759f4e54ad7461d10890"} Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.220636 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" event={"ID":"9c5505f7-9762-4120-b295-fd6d19803f1c","Type":"ContainerStarted","Data":"bf367ebbbe3cfc5478903a039ccf27e1d20e89cf3ac131a1a685454c38eda1eb"} Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.220890 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.229725 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.245658 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54bff9fc78-j6bfs" podStartSLOduration=3.245636375 podStartE2EDuration="3.245636375s" podCreationTimestamp="2026-02-28 03:39:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:39:26.243996334 +0000 UTC m=+304.709565192" watchObservedRunningTime="2026-02-28 03:39:26.245636375 +0000 UTC m=+304.711205233" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.283490 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" podStartSLOduration=3.283467919 podStartE2EDuration="3.283467919s" podCreationTimestamp="2026-02-28 03:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:39:26.263943372 +0000 UTC m=+304.729512230" watchObservedRunningTime="2026-02-28 03:39:26.283467919 +0000 UTC m=+304.749036777" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.515328 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-88b9fd9f5-wfjdw" Feb 28 03:39:26 crc kubenswrapper[4819]: E0228 03:39:26.555737 4819 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.556016 4819 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.556401 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553" gracePeriod=15 Feb 28 
03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.556414 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb" gracePeriod=15 Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.556460 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9" gracePeriod=15 Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.556584 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad" gracePeriod=15 Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.556377 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c" gracePeriod=15 Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.563458 4819 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 28 03:39:26 crc kubenswrapper[4819]: E0228 03:39:26.563780 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.563844 4819 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 28 03:39:26 crc kubenswrapper[4819]: E0228 03:39:26.563908 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.563961 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 28 03:39:26 crc kubenswrapper[4819]: E0228 03:39:26.564122 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.564199 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 28 03:39:26 crc kubenswrapper[4819]: E0228 03:39:26.564298 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.564352 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: E0228 03:39:26.564417 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.564469 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 28 03:39:26 crc kubenswrapper[4819]: E0228 03:39:26.564523 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.564572 4819 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 28 03:39:26 crc kubenswrapper[4819]: E0228 03:39:26.564631 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.564687 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: E0228 03:39:26.564738 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.564786 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: E0228 03:39:26.564843 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.564897 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.565031 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.565098 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.565149 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.565204 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.565285 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.565364 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.565424 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.565484 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: E0228 03:39:26.565616 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.565671 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.565813 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.567323 4819 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.568369 4819 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.572394 4819 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.593784 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.593991 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.594085 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.594157 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.594226 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.594315 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.594386 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.594475 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.697825 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.698034 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.698305 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.698426 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.698711 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.698770 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.698795 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.698828 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.699182 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.699341 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.699288 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.699442 4819 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.699915 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.699505 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.701001 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:26 crc kubenswrapper[4819]: I0228 03:39:26.701084 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:27 crc kubenswrapper[4819]: I0228 03:39:27.226442 4819 generic.go:334] "Generic (PLEG): 
container finished" podID="8da55153-bfac-4bce-8c0b-2ad25d556549" containerID="0a943832223bbf0cc6819cf36f1155398918ffafdc28d4a25b1d908e5b4e5bd6" exitCode=0 Feb 28 03:39:27 crc kubenswrapper[4819]: I0228 03:39:27.226523 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8da55153-bfac-4bce-8c0b-2ad25d556549","Type":"ContainerDied","Data":"0a943832223bbf0cc6819cf36f1155398918ffafdc28d4a25b1d908e5b4e5bd6"} Feb 28 03:39:27 crc kubenswrapper[4819]: I0228 03:39:27.227503 4819 status_manager.go:851] "Failed to get status for pod" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:27 crc kubenswrapper[4819]: I0228 03:39:27.228700 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 28 03:39:27 crc kubenswrapper[4819]: I0228 03:39:27.229923 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 03:39:27 crc kubenswrapper[4819]: I0228 03:39:27.230581 4819 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb" exitCode=0 Feb 28 03:39:27 crc kubenswrapper[4819]: I0228 03:39:27.230614 4819 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9" exitCode=0 Feb 28 03:39:27 crc kubenswrapper[4819]: I0228 03:39:27.230624 4819 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553" exitCode=0 Feb 28 03:39:27 crc kubenswrapper[4819]: I0228 03:39:27.230633 4819 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad" exitCode=2 Feb 28 03:39:27 crc kubenswrapper[4819]: I0228 03:39:27.230676 4819 scope.go:117] "RemoveContainer" containerID="0e3d4593b744863670b03ac945a785b46e4264ccb2cd86d46805a95995ba70d6" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.255064 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.709701 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.710223 4819 status_manager.go:851] "Failed to get status for pod" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.827736 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-kubelet-dir\") pod \"8da55153-bfac-4bce-8c0b-2ad25d556549\" (UID: \"8da55153-bfac-4bce-8c0b-2ad25d556549\") " Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.827942 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8da55153-bfac-4bce-8c0b-2ad25d556549-kube-api-access\") pod \"8da55153-bfac-4bce-8c0b-2ad25d556549\" (UID: 
\"8da55153-bfac-4bce-8c0b-2ad25d556549\") " Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.828061 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-var-lock\") pod \"8da55153-bfac-4bce-8c0b-2ad25d556549\" (UID: \"8da55153-bfac-4bce-8c0b-2ad25d556549\") " Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.827958 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8da55153-bfac-4bce-8c0b-2ad25d556549" (UID: "8da55153-bfac-4bce-8c0b-2ad25d556549"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.828272 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-var-lock" (OuterVolumeSpecName: "var-lock") pod "8da55153-bfac-4bce-8c0b-2ad25d556549" (UID: "8da55153-bfac-4bce-8c0b-2ad25d556549"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.828402 4819 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.828465 4819 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8da55153-bfac-4bce-8c0b-2ad25d556549-var-lock\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.836927 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da55153-bfac-4bce-8c0b-2ad25d556549-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8da55153-bfac-4bce-8c0b-2ad25d556549" (UID: "8da55153-bfac-4bce-8c0b-2ad25d556549"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.912376 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.913141 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.913664 4819 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.914295 4819 status_manager.go:851] "Failed to get status for pod" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:28 crc kubenswrapper[4819]: I0228 03:39:28.930022 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8da55153-bfac-4bce-8c0b-2ad25d556549-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.031348 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.031463 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.031504 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.032025 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.032065 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.032343 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.133442 4819 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.133493 4819 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.133512 4819 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.268415 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.269996 4819 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c" exitCode=0 Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.270095 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.270148 4819 scope.go:117] "RemoveContainer" containerID="6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.273460 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8da55153-bfac-4bce-8c0b-2ad25d556549","Type":"ContainerDied","Data":"bceeb9aed333b9ecf850d28568521ec990e2b11fc531f7f7af93a30731d4c1c9"} Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.273504 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bceeb9aed333b9ecf850d28568521ec990e2b11fc531f7f7af93a30731d4c1c9" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.273531 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.304216 4819 scope.go:117] "RemoveContainer" containerID="72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.304775 4819 status_manager.go:851] "Failed to get status for pod" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.305341 4819 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.305958 4819 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.306447 4819 status_manager.go:851] "Failed to get status for pod" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.328609 4819 scope.go:117] "RemoveContainer" containerID="9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.343940 4819 scope.go:117] "RemoveContainer" containerID="3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.359474 4819 scope.go:117] "RemoveContainer" containerID="5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.374729 4819 scope.go:117] "RemoveContainer" containerID="8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.393433 4819 scope.go:117] "RemoveContainer" containerID="6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb" Feb 28 03:39:29 crc kubenswrapper[4819]: E0228 03:39:29.393929 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\": container with ID starting with 6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb not found: ID does not exist" 
containerID="6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.393961 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb"} err="failed to get container status \"6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\": rpc error: code = NotFound desc = could not find container \"6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb\": container with ID starting with 6e3ad47ded0f6779209982f86ef8e4dd2f9bce48a7330a408a918be469bbd0cb not found: ID does not exist" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.393981 4819 scope.go:117] "RemoveContainer" containerID="72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9" Feb 28 03:39:29 crc kubenswrapper[4819]: E0228 03:39:29.394329 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\": container with ID starting with 72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9 not found: ID does not exist" containerID="72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.394383 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9"} err="failed to get container status \"72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\": rpc error: code = NotFound desc = could not find container \"72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9\": container with ID starting with 72378e53e8546500f3d64cdfc91c6b7fb15d39ddd33595788c1e131fb5353ae9 not found: ID does not exist" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.394422 4819 scope.go:117] 
"RemoveContainer" containerID="9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553" Feb 28 03:39:29 crc kubenswrapper[4819]: E0228 03:39:29.394676 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\": container with ID starting with 9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553 not found: ID does not exist" containerID="9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.394699 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553"} err="failed to get container status \"9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\": rpc error: code = NotFound desc = could not find container \"9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553\": container with ID starting with 9641437b1f3d51c078c1b3c18f20ebce4822dd604daf212e90d19d640b688553 not found: ID does not exist" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.394715 4819 scope.go:117] "RemoveContainer" containerID="3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad" Feb 28 03:39:29 crc kubenswrapper[4819]: E0228 03:39:29.394952 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\": container with ID starting with 3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad not found: ID does not exist" containerID="3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.394985 4819 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad"} err="failed to get container status \"3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\": rpc error: code = NotFound desc = could not find container \"3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad\": container with ID starting with 3b8264ec3d93c3b8f849434f53ad026378d05876de516b5c51be6e888d9280ad not found: ID does not exist" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.395005 4819 scope.go:117] "RemoveContainer" containerID="5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c" Feb 28 03:39:29 crc kubenswrapper[4819]: E0228 03:39:29.395234 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\": container with ID starting with 5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c not found: ID does not exist" containerID="5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.395683 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c"} err="failed to get container status \"5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\": rpc error: code = NotFound desc = could not find container \"5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c\": container with ID starting with 5932a7431fe68c77994ca6bc2c61b1f5ccd4904d8b4293ca07d50e378050103c not found: ID does not exist" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.395698 4819 scope.go:117] "RemoveContainer" containerID="8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355" Feb 28 03:39:29 crc kubenswrapper[4819]: E0228 03:39:29.395990 4819 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\": container with ID starting with 8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355 not found: ID does not exist" containerID="8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355" Feb 28 03:39:29 crc kubenswrapper[4819]: I0228 03:39:29.396050 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355"} err="failed to get container status \"8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\": rpc error: code = NotFound desc = could not find container \"8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355\": container with ID starting with 8b9f042abafcdbb4209f1a2f73afbf871a53de1ca147ce4a36de96be1b2c5355 not found: ID does not exist" Feb 28 03:39:30 crc kubenswrapper[4819]: I0228 03:39:30.377459 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 28 03:39:30 crc kubenswrapper[4819]: I0228 03:39:30.834655 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 03:39:30 crc kubenswrapper[4819]: I0228 03:39:30.834728 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 03:39:30 crc kubenswrapper[4819]: I0228 03:39:30.834794 4819 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 03:39:30 crc kubenswrapper[4819]: E0228 03:39:30.835935 4819 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events/machine-config-daemon-rw4hn.18984be77481a776\": dial tcp 38.102.83.212:6443: connect: connection refused" event=< Feb 28 03:39:30 crc kubenswrapper[4819]: &Event{ObjectMeta:{machine-config-daemon-rw4hn.18984be77481a776 openshift-machine-config-operator 29497 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-rw4hn,UID:d6ad11c1-0eb7-4064-bb39-3ffb389efb90,APIVersion:v1,ResourceVersion:26691,FieldPath:spec.containers{machine-config-daemon},},Reason:ProbeError,Message:Liveness probe error: Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused Feb 28 03:39:30 crc kubenswrapper[4819]: body: Feb 28 03:39:30 crc kubenswrapper[4819]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:38:30 +0000 UTC,LastTimestamp:2026-02-28 03:39:30.834707506 +0000 UTC m=+309.300276404,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 03:39:30 crc kubenswrapper[4819]: > Feb 28 03:39:30 crc kubenswrapper[4819]: I0228 03:39:30.836151 4819 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af"} pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 03:39:30 crc kubenswrapper[4819]: I0228 
03:39:30.836277 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" containerID="cri-o://edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af" gracePeriod=600 Feb 28 03:39:31 crc kubenswrapper[4819]: I0228 03:39:31.288848 4819 generic.go:334] "Generic (PLEG): container finished" podID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerID="edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af" exitCode=0 Feb 28 03:39:31 crc kubenswrapper[4819]: I0228 03:39:31.288958 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerDied","Data":"edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af"} Feb 28 03:39:31 crc kubenswrapper[4819]: I0228 03:39:31.289278 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerStarted","Data":"2e1444d8a76cee2b7dbf599d5d429088251da1e9a8f9ace55b15b8ab10db4eaf"} Feb 28 03:39:31 crc kubenswrapper[4819]: I0228 03:39:31.290202 4819 status_manager.go:851] "Failed to get status for pod" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:31 crc kubenswrapper[4819]: I0228 03:39:31.290713 4819 status_manager.go:851] "Failed to get status for pod" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-rw4hn\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:31 crc kubenswrapper[4819]: E0228 03:39:31.593547 4819 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:31 crc kubenswrapper[4819]: I0228 03:39:31.594438 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:31 crc kubenswrapper[4819]: W0228 03:39:31.623106 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-7a3d31c308dd09d0a6dd7e9c1c40021ec824fa8b84abb61f15a358bd4c70f67c WatchSource:0}: Error finding container 7a3d31c308dd09d0a6dd7e9c1c40021ec824fa8b84abb61f15a358bd4c70f67c: Status 404 returned error can't find the container with id 7a3d31c308dd09d0a6dd7e9c1c40021ec824fa8b84abb61f15a358bd4c70f67c Feb 28 03:39:32 crc kubenswrapper[4819]: I0228 03:39:32.298717 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03"} Feb 28 03:39:32 crc kubenswrapper[4819]: I0228 03:39:32.299141 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7a3d31c308dd09d0a6dd7e9c1c40021ec824fa8b84abb61f15a358bd4c70f67c"} Feb 28 03:39:32 crc kubenswrapper[4819]: I0228 03:39:32.300049 4819 
status_manager.go:851] "Failed to get status for pod" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:32 crc kubenswrapper[4819]: E0228 03:39:32.300051 4819 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:32 crc kubenswrapper[4819]: I0228 03:39:32.300536 4819 status_manager.go:851] "Failed to get status for pod" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-rw4hn\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:32 crc kubenswrapper[4819]: I0228 03:39:32.377994 4819 status_manager.go:851] "Failed to get status for pod" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:32 crc kubenswrapper[4819]: I0228 03:39:32.378683 4819 status_manager.go:851] "Failed to get status for pod" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-rw4hn\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:33 crc kubenswrapper[4819]: E0228 03:39:33.306566 4819 kubelet.go:1929] "Failed creating a 
mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:39:34 crc kubenswrapper[4819]: E0228 03:39:34.072967 4819 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:34 crc kubenswrapper[4819]: E0228 03:39:34.073829 4819 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:34 crc kubenswrapper[4819]: E0228 03:39:34.074602 4819 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:34 crc kubenswrapper[4819]: E0228 03:39:34.075305 4819 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:34 crc kubenswrapper[4819]: E0228 03:39:34.075868 4819 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:34 crc kubenswrapper[4819]: I0228 03:39:34.075920 4819 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 28 03:39:34 crc kubenswrapper[4819]: E0228 03:39:34.076352 4819 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="200ms" Feb 28 03:39:34 crc kubenswrapper[4819]: E0228 03:39:34.277347 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="400ms" Feb 28 03:39:34 crc kubenswrapper[4819]: E0228 03:39:34.679328 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="800ms" Feb 28 03:39:35 crc kubenswrapper[4819]: E0228 03:39:35.480764 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="1.6s" Feb 28 03:39:37 crc kubenswrapper[4819]: E0228 03:39:37.053676 4819 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events/machine-config-daemon-rw4hn.18984be77481a776\": dial tcp 38.102.83.212:6443: connect: connection refused" event=< Feb 28 03:39:37 crc kubenswrapper[4819]: &Event{ObjectMeta:{machine-config-daemon-rw4hn.18984be77481a776 openshift-machine-config-operator 29497 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-rw4hn,UID:d6ad11c1-0eb7-4064-bb39-3ffb389efb90,APIVersion:v1,ResourceVersion:26691,FieldPath:spec.containers{machine-config-daemon},},Reason:ProbeError,Message:Liveness probe error: Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused Feb 28 03:39:37 crc kubenswrapper[4819]: body: Feb 28 03:39:37 crc kubenswrapper[4819]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-28 03:38:30 +0000 UTC,LastTimestamp:2026-02-28 03:39:30.834707506 +0000 UTC m=+309.300276404,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 28 03:39:37 crc kubenswrapper[4819]: > Feb 28 03:39:37 crc kubenswrapper[4819]: E0228 03:39:37.082387 4819 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="3.2s" Feb 28 03:39:38 crc kubenswrapper[4819]: I0228 03:39:38.368424 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:38 crc kubenswrapper[4819]: I0228 03:39:38.370752 4819 status_manager.go:851] "Failed to get status for pod" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:38 crc kubenswrapper[4819]: I0228 03:39:38.371122 4819 status_manager.go:851] "Failed to get status for pod" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-rw4hn\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:38 crc kubenswrapper[4819]: I0228 03:39:38.403594 4819 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f" Feb 28 03:39:38 crc kubenswrapper[4819]: I0228 03:39:38.403649 4819 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f" Feb 28 03:39:38 crc kubenswrapper[4819]: E0228 03:39:38.404346 4819 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:38 crc kubenswrapper[4819]: I0228 03:39:38.405147 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:38 crc kubenswrapper[4819]: W0228 03:39:38.444448 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-863c8182a7ad8a883ed39065e6cc619d6b4db1d49e9477fb1388d6232d2a6c26 WatchSource:0}: Error finding container 863c8182a7ad8a883ed39065e6cc619d6b4db1d49e9477fb1388d6232d2a6c26: Status 404 returned error can't find the container with id 863c8182a7ad8a883ed39065e6cc619d6b4db1d49e9477fb1388d6232d2a6c26 Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.344980 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.347614 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.347657 4819 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="67db3bae0b60db8f41b5448a1b29d377d320b668f3aaebfcaec593d99c8849e5" exitCode=1 Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.347714 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"67db3bae0b60db8f41b5448a1b29d377d320b668f3aaebfcaec593d99c8849e5"} Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.348130 4819 scope.go:117] "RemoveContainer" containerID="67db3bae0b60db8f41b5448a1b29d377d320b668f3aaebfcaec593d99c8849e5" Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.348506 4819 status_manager.go:851] "Failed to get status for 
pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.349971 4819 status_manager.go:851] "Failed to get status for pod" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.350716 4819 status_manager.go:851] "Failed to get status for pod" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-rw4hn\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.351181 4819 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="4760d11ecda86ec51b5a746b325038b4d9d14b108ad5e04a0b46183b2dba2c1e" exitCode=0 Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.351212 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"4760d11ecda86ec51b5a746b325038b4d9d14b108ad5e04a0b46183b2dba2c1e"} Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.351233 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"863c8182a7ad8a883ed39065e6cc619d6b4db1d49e9477fb1388d6232d2a6c26"} Feb 
28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.351471 4819 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f" Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.351489 4819 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f" Feb 28 03:39:39 crc kubenswrapper[4819]: E0228 03:39:39.351701 4819 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.354480 4819 status_manager.go:851] "Failed to get status for pod" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-rw4hn\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.362936 4819 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 03:39:39 crc kubenswrapper[4819]: I0228 03:39:39.363711 4819 status_manager.go:851] "Failed to get status for pod" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Feb 28 
03:39:40 crc kubenswrapper[4819]: I0228 03:39:40.359034 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 28 03:39:40 crc kubenswrapper[4819]: I0228 03:39:40.360209 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 28 03:39:40 crc kubenswrapper[4819]: I0228 03:39:40.360319 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5b67e3ff67e7f95396d9f566222bad81815aeff5f6f2fa6f5e6faef997e6edcb"} Feb 28 03:39:40 crc kubenswrapper[4819]: I0228 03:39:40.364410 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e135c7767246eaa6056059ddae17e7928ae3a06983dc0260b41481f01320054a"} Feb 28 03:39:40 crc kubenswrapper[4819]: I0228 03:39:40.364451 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ff31f21ed0ec8ae893bf1836bb44322ad14e57e975fd318bca9b8ee32f6cc208"} Feb 28 03:39:40 crc kubenswrapper[4819]: I0228 03:39:40.364461 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"090e1dd42ae256ded83eda724f8fba012c078068f4e93e55ef2d58b0a37550bb"} Feb 28 03:39:40 crc kubenswrapper[4819]: I0228 03:39:40.944964 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" 
podUID="f3590dc7-98a1-45cf-a420-f045d5d38335" containerName="oauth-openshift" containerID="cri-o://da5d61713f53a5a3908715f6665dae231eeefcefe7344c1daf71ef4cf8a4d508" gracePeriod=15 Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.374727 4819 generic.go:334] "Generic (PLEG): container finished" podID="f3590dc7-98a1-45cf-a420-f045d5d38335" containerID="da5d61713f53a5a3908715f6665dae231eeefcefe7344c1daf71ef4cf8a4d508" exitCode=0 Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.374859 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" event={"ID":"f3590dc7-98a1-45cf-a420-f045d5d38335","Type":"ContainerDied","Data":"da5d61713f53a5a3908715f6665dae231eeefcefe7344c1daf71ef4cf8a4d508"} Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.378974 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"25c6fcfb792e1fc6f88c30e55267c5299b31e80e9361827fb4f7c3e3e7b86828"} Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.379002 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d7fc6c435d42618e19e84a3021341c17fd6797e55a581ac380ccce99062b6c4d"} Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.379275 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.379458 4819 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.379492 4819 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f" Feb 
28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.436451 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.502691 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-login\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.502803 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-service-ca\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.502860 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-dir\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.502908 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-session\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.502941 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-cliconfig\") pod 
\"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.502978 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd9sd\" (UniqueName: \"kubernetes.io/projected/f3590dc7-98a1-45cf-a420-f045d5d38335-kube-api-access-zd9sd\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.503007 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-policies\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.503023 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.503625 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.503675 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.503716 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.504032 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-ocp-branding-template\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.504078 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-router-certs\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.504114 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-serving-cert\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.504143 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-idp-0-file-data\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.504179 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-trusted-ca-bundle\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.504222 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-error\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.504308 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-provider-selection\") pod \"f3590dc7-98a1-45cf-a420-f045d5d38335\" (UID: \"f3590dc7-98a1-45cf-a420-f045d5d38335\") " Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.504605 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.504634 4819 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.504653 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.504672 4819 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.509553 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.516665 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.518578 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.518871 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.518875 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3590dc7-98a1-45cf-a420-f045d5d38335-kube-api-access-zd9sd" (OuterVolumeSpecName: "kube-api-access-zd9sd") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "kube-api-access-zd9sd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.519538 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.519730 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.519967 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.528916 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.529033 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f3590dc7-98a1-45cf-a420-f045d5d38335" (UID: "f3590dc7-98a1-45cf-a420-f045d5d38335"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.605886 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.605943 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.605975 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.606005 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.606033 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.606054 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.606074 4819 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.606093 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd9sd\" (UniqueName: \"kubernetes.io/projected/f3590dc7-98a1-45cf-a420-f045d5d38335-kube-api-access-zd9sd\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.606111 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:41 crc kubenswrapper[4819]: I0228 03:39:41.606131 4819 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f3590dc7-98a1-45cf-a420-f045d5d38335-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 28 03:39:42 crc kubenswrapper[4819]: I0228 03:39:42.386415 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" event={"ID":"f3590dc7-98a1-45cf-a420-f045d5d38335","Type":"ContainerDied","Data":"2539baa888cd9b9f9a34f873624859bb0d7ba333ca3c6140efaa649ffd477209"} Feb 28 03:39:42 crc kubenswrapper[4819]: I0228 03:39:42.386495 4819 scope.go:117] "RemoveContainer" containerID="da5d61713f53a5a3908715f6665dae231eeefcefe7344c1daf71ef4cf8a4d508" Feb 28 03:39:42 crc kubenswrapper[4819]: I0228 03:39:42.386494 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w5fln" Feb 28 03:39:43 crc kubenswrapper[4819]: I0228 03:39:43.405206 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:43 crc kubenswrapper[4819]: I0228 03:39:43.405662 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:43 crc kubenswrapper[4819]: I0228 03:39:43.411758 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:46 crc kubenswrapper[4819]: I0228 03:39:46.270683 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:39:46 crc kubenswrapper[4819]: I0228 03:39:46.275534 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:39:46 crc kubenswrapper[4819]: I0228 03:39:46.388056 4819 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:46 crc kubenswrapper[4819]: I0228 03:39:46.418676 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:39:46 crc kubenswrapper[4819]: I0228 03:39:46.418865 4819 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f" Feb 28 03:39:46 crc kubenswrapper[4819]: I0228 03:39:46.418904 4819 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f" Feb 28 03:39:46 crc kubenswrapper[4819]: I0228 03:39:46.423043 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 28 03:39:46 crc kubenswrapper[4819]: I0228 03:39:46.425762 4819 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4088a52c-7485-495e-815b-d668e712f68e" Feb 28 03:39:47 crc kubenswrapper[4819]: I0228 03:39:47.425911 4819 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f" Feb 28 03:39:47 crc kubenswrapper[4819]: I0228 03:39:47.426470 4819 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f" Feb 28 03:39:52 crc kubenswrapper[4819]: I0228 03:39:52.400822 4819 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4088a52c-7485-495e-815b-d668e712f68e" Feb 28 03:39:57 crc kubenswrapper[4819]: I0228 03:39:57.084405 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 28 03:39:57 crc kubenswrapper[4819]: I0228 03:39:57.161041 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 28 03:39:57 crc kubenswrapper[4819]: I0228 03:39:57.281417 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 28 03:39:57 crc kubenswrapper[4819]: I0228 03:39:57.971976 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 28 03:39:58 crc kubenswrapper[4819]: I0228 03:39:58.013977 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" 
Feb 28 03:39:58 crc kubenswrapper[4819]: I0228 03:39:58.024293 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 28 03:39:58 crc kubenswrapper[4819]: I0228 03:39:58.259313 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 28 03:39:58 crc kubenswrapper[4819]: I0228 03:39:58.280082 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 28 03:39:58 crc kubenswrapper[4819]: I0228 03:39:58.316183 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 28 03:39:58 crc kubenswrapper[4819]: I0228 03:39:58.319701 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 28 03:39:58 crc kubenswrapper[4819]: I0228 03:39:58.397398 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 28 03:39:58 crc kubenswrapper[4819]: I0228 03:39:58.611841 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 28 03:39:58 crc kubenswrapper[4819]: I0228 03:39:58.633625 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 28 03:39:58 crc kubenswrapper[4819]: I0228 03:39:58.702373 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 28 03:39:58 crc kubenswrapper[4819]: I0228 03:39:58.960336 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 
03:39:59.080913 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.155410 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.173456 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.195737 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.228151 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.234397 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.325854 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.533423 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.545237 4819 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.573957 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.585933 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 28 03:39:59 crc 
kubenswrapper[4819]: I0228 03:39:59.619349 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.653537 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.735875 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.853345 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.900059 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 28 03:39:59 crc kubenswrapper[4819]: I0228 03:39:59.950911 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.122996 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.163819 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.192783 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.271182 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.470304 4819 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"dns-default" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.569532 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.645105 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.730756 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.763716 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.874071 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.906736 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.983787 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 28 03:40:00 crc kubenswrapper[4819]: I0228 03:40:00.998193 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.039104 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.065572 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 28 03:40:01 crc kubenswrapper[4819]: 
I0228 03:40:01.068263 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.115362 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.215543 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.331387 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.413134 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.606639 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.655360 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.883686 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.893953 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.916917 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.926757 4819 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 28 03:40:01 crc kubenswrapper[4819]: I0228 03:40:01.976044 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 28 03:40:02 crc kubenswrapper[4819]: I0228 03:40:02.084407 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 28 03:40:02 crc kubenswrapper[4819]: I0228 03:40:02.139827 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 28 03:40:02 crc kubenswrapper[4819]: I0228 03:40:02.158720 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 28 03:40:02 crc kubenswrapper[4819]: I0228 03:40:02.217739 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 28 03:40:02 crc kubenswrapper[4819]: I0228 03:40:02.231187 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 28 03:40:02 crc kubenswrapper[4819]: I0228 03:40:02.237138 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 28 03:40:02 crc kubenswrapper[4819]: I0228 03:40:02.497397 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 28 03:40:02 crc kubenswrapper[4819]: I0228 03:40:02.497871 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 28 03:40:02 crc kubenswrapper[4819]: I0228 03:40:02.517878 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 28 03:40:02 crc 
kubenswrapper[4819]: I0228 03:40:02.587887 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 28 03:40:02 crc kubenswrapper[4819]: I0228 03:40:02.650613 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 28 03:40:02 crc kubenswrapper[4819]: I0228 03:40:02.998770 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 28 03:40:03 crc kubenswrapper[4819]: I0228 03:40:03.033971 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 28 03:40:03 crc kubenswrapper[4819]: I0228 03:40:03.124628 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 28 03:40:03 crc kubenswrapper[4819]: I0228 03:40:03.174121 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 28 03:40:03 crc kubenswrapper[4819]: I0228 03:40:03.211110 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 28 03:40:03 crc kubenswrapper[4819]: I0228 03:40:03.379822 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 28 03:40:03 crc kubenswrapper[4819]: I0228 03:40:03.571273 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 28 03:40:03 crc kubenswrapper[4819]: I0228 03:40:03.640278 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 28 03:40:03 crc kubenswrapper[4819]: I0228 03:40:03.705509 4819 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 28 03:40:03 crc kubenswrapper[4819]: I0228 03:40:03.854596 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.208858 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.288715 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.294782 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.358537 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.393859 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.403558 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.434453 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.515883 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.558523 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 
03:40:04.598229 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.790091 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.815427 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 28 03:40:04 crc kubenswrapper[4819]: I0228 03:40:04.979897 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.075788 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.125871 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.126838 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.163656 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.193866 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.262783 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.401527 4819 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.401812 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.569083 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.581291 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.663433 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.793560 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.800204 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.984945 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 28 03:40:05 crc kubenswrapper[4819]: I0228 03:40:05.996949 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.015157 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.085336 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.088799 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.144133 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.155638 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.176451 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.191445 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.321468 4819 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.374440 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.562476 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.609223 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.635186 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.688710 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.688947 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.718679 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.748122 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.765844 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.781900 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.885053 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 28 03:40:06 crc kubenswrapper[4819]: I0228 03:40:06.942202 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.014570 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.127748 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.172877 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.201517 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.275321 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.276118 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.299374 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.309051 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.315761 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.354947 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.362077 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.411962 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.551911 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.587012 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.739610 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.753076 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.756743 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.773230 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.782042 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.787043 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.816550 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.829627 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.829810 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.949788 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 28 03:40:07 crc kubenswrapper[4819]: I0228 03:40:07.989981 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.092527 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.115976 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.193708 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.207171 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.215729 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.239793 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.252155 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.298267 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.361743 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.395300 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.542427 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.601507 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.809354 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.810450 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.834127 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.834199 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.871648 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.875852 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.910624 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.925543 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 28 03:40:08 crc kubenswrapper[4819]: I0228 03:40:08.967648 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.135922 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.314187 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.357945 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.383139 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.435651 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.503575 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.519740 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.544825 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.596718 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.597899 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.601130 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.632767 4819 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.634988 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.652797 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.667725 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.731459 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 28 03:40:09 crc kubenswrapper[4819]: I0228 03:40:09.884988 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.069440 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.095102 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.101368 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.111618 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.114889 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.465788 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.649361 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.782580 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.796496 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.921905 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.946827 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 28 03:40:10 crc kubenswrapper[4819]: I0228 03:40:10.952657 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.015390 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.031864 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.202666 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.331376 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.390664 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.394855 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.444457 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.522213 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.757774 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.760452 4819 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.761393 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.829603 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 28 03:40:11 crc kubenswrapper[4819]: I0228 03:40:11.934020 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 28 03:40:12 crc kubenswrapper[4819]: I0228 03:40:12.002269 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 28 03:40:12 crc kubenswrapper[4819]: I0228 03:40:12.016288 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 28 03:40:12 crc kubenswrapper[4819]: I0228 03:40:12.083124 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 28 03:40:12 crc kubenswrapper[4819]: I0228 03:40:12.180309 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 28 03:40:12 crc kubenswrapper[4819]: I0228 03:40:12.608053 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 28 03:40:12 crc kubenswrapper[4819]: I0228 03:40:12.647445 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 28 03:40:12 crc kubenswrapper[4819]: I0228 03:40:12.693948 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 28 03:40:12 crc kubenswrapper[4819]: I0228 03:40:12.800977 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 28 03:40:12 crc kubenswrapper[4819]: I0228 03:40:12.857720 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 28 03:40:12 crc kubenswrapper[4819]: I0228 03:40:12.935440 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.192998 4819 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.200239 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-w5fln"]
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.200381 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-infra/auto-csr-approver-29537500-dnvc5","openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"]
Feb 28 03:40:13 crc kubenswrapper[4819]: E0228 03:40:13.200653 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" containerName="installer"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.200681 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" containerName="installer"
Feb 28 03:40:13 crc kubenswrapper[4819]: E0228 03:40:13.200711 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3590dc7-98a1-45cf-a420-f045d5d38335" containerName="oauth-openshift"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.200727 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3590dc7-98a1-45cf-a420-f045d5d38335" containerName="oauth-openshift"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.201058 4819 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.201111 4819 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="036831af-028e-4d9b-913c-15cb3632eb9f"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.201172 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da55153-bfac-4bce-8c0b-2ad25d556549" containerName="installer"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.201212 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3590dc7-98a1-45cf-a420-f045d5d38335" containerName="oauth-openshift"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.202133 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.203160 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537500-dnvc5"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.210825 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.211142 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.211417 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.211427 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.211908 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.211945 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.212229 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.212605 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.212919 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.213066 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.213147 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.212946 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.213237 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.213796 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.215400 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.222471 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.225440 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.230886 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.238212 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.247514 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=27.247486475 podStartE2EDuration="27.247486475s" podCreationTimestamp="2026-02-28 03:39:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:40:13.242530161 +0000 UTC m=+351.708099099" watchObservedRunningTime="2026-02-28 03:40:13.247486475 +0000 UTC m=+351.713055363"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.303795 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.304923 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.374312 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-audit-dir\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.374380 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.374419 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqxtl\" (UniqueName: \"kubernetes.io/projected/4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf-kube-api-access-tqxtl\") pod \"auto-csr-approver-29537500-dnvc5\" (UID: \"4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf\") " pod="openshift-infra/auto-csr-approver-29537500-dnvc5"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.374647 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-service-ca\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.374720 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-session\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.374758 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.374832 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.375420 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.375619 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-template-login\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.375783 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.375932 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-router-certs\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.376072 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.376220 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkqzq\" (UniqueName: \"kubernetes.io/projected/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-kube-api-access-jkqzq\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.376441 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-template-error\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.376599 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-audit-policies\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477274 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477333 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-template-login\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477364 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477384 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-router-certs\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477410 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477429 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkqzq\" (UniqueName: \"kubernetes.io/projected/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-kube-api-access-jkqzq\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477456 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-template-error\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477483 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-audit-policies\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477517 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-audit-dir\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"
Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477538
4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477570 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqxtl\" (UniqueName: \"kubernetes.io/projected/4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf-kube-api-access-tqxtl\") pod \"auto-csr-approver-29537500-dnvc5\" (UID: \"4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf\") " pod="openshift-infra/auto-csr-approver-29537500-dnvc5" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477607 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-session\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477628 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-service-ca\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477652 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.477687 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.478053 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-audit-dir\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.478812 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.479735 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-audit-policies\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.479788 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.480161 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-service-ca\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.487651 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.487736 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.488467 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: 
\"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.488659 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.488822 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-router-certs\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.490387 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-template-login\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.490739 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-system-session\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.491165 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-v4-0-config-user-template-error\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.506105 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkqzq\" (UniqueName: \"kubernetes.io/projected/8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c-kube-api-access-jkqzq\") pod \"oauth-openshift-57ccb4dddc-tn56m\" (UID: \"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.512231 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqxtl\" (UniqueName: \"kubernetes.io/projected/4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf-kube-api-access-tqxtl\") pod \"auto-csr-approver-29537500-dnvc5\" (UID: \"4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf\") " pod="openshift-infra/auto-csr-approver-29537500-dnvc5" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.546593 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.560301 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537500-dnvc5" Feb 28 03:40:13 crc kubenswrapper[4819]: I0228 03:40:13.664885 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.039478 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537500-dnvc5"] Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.099395 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57ccb4dddc-tn56m"] Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.135600 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.270168 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.385185 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3590dc7-98a1-45cf-a420-f045d5d38335" path="/var/lib/kubelet/pods/f3590dc7-98a1-45cf-a420-f045d5d38335/volumes" Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.605797 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" event={"ID":"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c","Type":"ContainerStarted","Data":"f13c3aeaa6711c908c5bab5beba355e3a99e0afd6f4f16b2e39e17bd085be8aa"} Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.606096 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.606110 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" 
event={"ID":"8d4d2f3e-6cea-47c8-97b2-7ec1a0e7a77c","Type":"ContainerStarted","Data":"ac70bb2294b04f497330ff2692208fb539666b11f93e0eb18c37cc6c544f84f8"} Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.608191 4819 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.608851 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537500-dnvc5" event={"ID":"4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf","Type":"ContainerStarted","Data":"153db48fed242197170c070a4108e507ad93f859bedc59250495bcb68f27acac"} Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.697445 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.748443 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 28 03:40:14 crc kubenswrapper[4819]: I0228 03:40:14.873828 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 28 03:40:15 crc kubenswrapper[4819]: I0228 03:40:15.119103 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" Feb 28 03:40:15 crc kubenswrapper[4819]: I0228 03:40:15.144634 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-57ccb4dddc-tn56m" podStartSLOduration=60.144612822 podStartE2EDuration="1m0.144612822s" podCreationTimestamp="2026-02-28 03:39:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:40:14.637700871 +0000 UTC m=+353.103269739" watchObservedRunningTime="2026-02-28 03:40:15.144612822 +0000 UTC m=+353.610181680" Feb 28 
03:40:15 crc kubenswrapper[4819]: I0228 03:40:15.280531 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 28 03:40:15 crc kubenswrapper[4819]: I0228 03:40:15.615990 4819 generic.go:334] "Generic (PLEG): container finished" podID="4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf" containerID="0dc197adab1f232babb74084e1a0fec8e9c06953be31dbbc0e5e849f6234a129" exitCode=0 Feb 28 03:40:15 crc kubenswrapper[4819]: I0228 03:40:15.616066 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537500-dnvc5" event={"ID":"4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf","Type":"ContainerDied","Data":"0dc197adab1f232babb74084e1a0fec8e9c06953be31dbbc0e5e849f6234a129"} Feb 28 03:40:16 crc kubenswrapper[4819]: I0228 03:40:16.903777 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537500-dnvc5" Feb 28 03:40:16 crc kubenswrapper[4819]: I0228 03:40:16.928745 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqxtl\" (UniqueName: \"kubernetes.io/projected/4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf-kube-api-access-tqxtl\") pod \"4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf\" (UID: \"4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf\") " Feb 28 03:40:16 crc kubenswrapper[4819]: I0228 03:40:16.936964 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf-kube-api-access-tqxtl" (OuterVolumeSpecName: "kube-api-access-tqxtl") pod "4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf" (UID: "4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf"). InnerVolumeSpecName "kube-api-access-tqxtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:40:17 crc kubenswrapper[4819]: I0228 03:40:17.030943 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqxtl\" (UniqueName: \"kubernetes.io/projected/4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf-kube-api-access-tqxtl\") on node \"crc\" DevicePath \"\"" Feb 28 03:40:17 crc kubenswrapper[4819]: I0228 03:40:17.438902 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 28 03:40:17 crc kubenswrapper[4819]: I0228 03:40:17.641195 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537500-dnvc5" event={"ID":"4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf","Type":"ContainerDied","Data":"153db48fed242197170c070a4108e507ad93f859bedc59250495bcb68f27acac"} Feb 28 03:40:17 crc kubenswrapper[4819]: I0228 03:40:17.641280 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="153db48fed242197170c070a4108e507ad93f859bedc59250495bcb68f27acac" Feb 28 03:40:17 crc kubenswrapper[4819]: I0228 03:40:17.641312 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537500-dnvc5" Feb 28 03:40:20 crc kubenswrapper[4819]: I0228 03:40:20.106172 4819 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 28 03:40:20 crc kubenswrapper[4819]: I0228 03:40:20.107051 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03" gracePeriod=5 Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.682528 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.682926 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.705524 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.705591 4819 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03" exitCode=137 Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.705648 4819 scope.go:117] "RemoveContainer" containerID="42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.705812 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.724661 4819 scope.go:117] "RemoveContainer" containerID="42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03" Feb 28 03:40:25 crc kubenswrapper[4819]: E0228 03:40:25.725413 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03\": container with ID starting with 42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03 not found: ID does not exist" containerID="42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.725481 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03"} err="failed to get container status \"42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03\": rpc error: code = NotFound desc = could not find container \"42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03\": container with ID starting with 42b816905546642aa7de758e5e17fb1f2f3aacb3abf825cf0e0a38a6b0bf8e03 not found: ID does not exist" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.754565 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.754633 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 
28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.754685 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.754764 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.754801 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.756015 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.756089 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.756104 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.756178 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.765184 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.860094 4819 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.860146 4819 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.860167 4819 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.860188 4819 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 28 03:40:25 crc kubenswrapper[4819]: I0228 03:40:25.860205 4819 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 28 03:40:26 crc kubenswrapper[4819]: I0228 03:40:26.379789 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 28 03:40:34 crc kubenswrapper[4819]: I0228 03:40:34.807051 4819 generic.go:334] "Generic (PLEG): container finished" podID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerID="332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8" exitCode=0 Feb 28 03:40:34 crc kubenswrapper[4819]: I0228 03:40:34.807494 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" event={"ID":"0a08aebb-db7e-488c-b992-2286ba6c9fd0","Type":"ContainerDied","Data":"332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8"} Feb 28 03:40:34 crc kubenswrapper[4819]: I0228 03:40:34.808306 4819 scope.go:117] "RemoveContainer" containerID="332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8" Feb 28 03:40:35 crc kubenswrapper[4819]: I0228 03:40:35.817015 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" event={"ID":"0a08aebb-db7e-488c-b992-2286ba6c9fd0","Type":"ContainerStarted","Data":"1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611"} Feb 28 03:40:35 crc kubenswrapper[4819]: I0228 03:40:35.817485 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:40:35 crc kubenswrapper[4819]: I0228 03:40:35.819763 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.291130 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bkx4p"] Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.297057 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bkx4p" podUID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" containerName="registry-server" containerID="cri-o://3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706" gracePeriod=30 Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.300060 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hnqz"] Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.300431 4819 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-8hnqz" podUID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" containerName="registry-server" containerID="cri-o://0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61" gracePeriod=30 Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.315387 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nb49n"] Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.315832 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" podUID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerName="marketplace-operator" containerID="cri-o://1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611" gracePeriod=30 Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.322705 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxgv4"] Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.323137 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fxgv4" podUID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" containerName="registry-server" containerID="cri-o://f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7" gracePeriod=30 Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.343718 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zhxpx"] Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.344289 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zhxpx" podUID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" containerName="registry-server" containerID="cri-o://1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b" gracePeriod=30 Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.358797 4819 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wnnt5"] Feb 28 03:41:32 crc kubenswrapper[4819]: E0228 03:41:32.359596 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.359616 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 28 03:41:32 crc kubenswrapper[4819]: E0228 03:41:32.359660 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf" containerName="oc" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.359669 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf" containerName="oc" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.359903 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf" containerName="oc" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.359928 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.360906 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.384092 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wnnt5"] Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.400979 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15512f8d-a53e-47cb-9b22-b8f8f410d65d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wnnt5\" (UID: \"15512f8d-a53e-47cb-9b22-b8f8f410d65d\") " pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.401261 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/15512f8d-a53e-47cb-9b22-b8f8f410d65d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wnnt5\" (UID: \"15512f8d-a53e-47cb-9b22-b8f8f410d65d\") " pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.401362 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdg8n\" (UniqueName: \"kubernetes.io/projected/15512f8d-a53e-47cb-9b22-b8f8f410d65d-kube-api-access-pdg8n\") pod \"marketplace-operator-79b997595-wnnt5\" (UID: \"15512f8d-a53e-47cb-9b22-b8f8f410d65d\") " pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.502257 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15512f8d-a53e-47cb-9b22-b8f8f410d65d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wnnt5\" (UID: 
\"15512f8d-a53e-47cb-9b22-b8f8f410d65d\") " pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.502327 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/15512f8d-a53e-47cb-9b22-b8f8f410d65d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wnnt5\" (UID: \"15512f8d-a53e-47cb-9b22-b8f8f410d65d\") " pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.502345 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdg8n\" (UniqueName: \"kubernetes.io/projected/15512f8d-a53e-47cb-9b22-b8f8f410d65d-kube-api-access-pdg8n\") pod \"marketplace-operator-79b997595-wnnt5\" (UID: \"15512f8d-a53e-47cb-9b22-b8f8f410d65d\") " pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.504011 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15512f8d-a53e-47cb-9b22-b8f8f410d65d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wnnt5\" (UID: \"15512f8d-a53e-47cb-9b22-b8f8f410d65d\") " pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.508024 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/15512f8d-a53e-47cb-9b22-b8f8f410d65d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wnnt5\" (UID: \"15512f8d-a53e-47cb-9b22-b8f8f410d65d\") " pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.517062 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pdg8n\" (UniqueName: \"kubernetes.io/projected/15512f8d-a53e-47cb-9b22-b8f8f410d65d-kube-api-access-pdg8n\") pod \"marketplace-operator-79b997595-wnnt5\" (UID: \"15512f8d-a53e-47cb-9b22-b8f8f410d65d\") " pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.769821 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.777432 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hnqz" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.778917 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.786941 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.802601 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bkx4p" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.805807 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwwx8\" (UniqueName: \"kubernetes.io/projected/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-kube-api-access-wwwx8\") pod \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.805859 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzsvq\" (UniqueName: \"kubernetes.io/projected/0a08aebb-db7e-488c-b992-2286ba6c9fd0-kube-api-access-mzsvq\") pod \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\" (UID: \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.805877 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-operator-metrics\") pod \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\" (UID: \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.805902 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-trusted-ca\") pod \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\" (UID: \"0a08aebb-db7e-488c-b992-2286ba6c9fd0\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.805917 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-utilities\") pod \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\" (UID: \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.805945 4819 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-catalog-content\") pod \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.805965 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-catalog-content\") pod \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\" (UID: \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.805988 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-utilities\") pod \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\" (UID: \"2b9aa27e-76e7-4507-bca7-0ee08ff3a968\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.806004 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8knm\" (UniqueName: \"kubernetes.io/projected/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-kube-api-access-l8knm\") pod \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\" (UID: \"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.806585 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0a08aebb-db7e-488c-b992-2286ba6c9fd0" (UID: "0a08aebb-db7e-488c-b992-2286ba6c9fd0"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.807075 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-utilities" (OuterVolumeSpecName: "utilities") pod "8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" (UID: "8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.807208 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-utilities" (OuterVolumeSpecName: "utilities") pod "2b9aa27e-76e7-4507-bca7-0ee08ff3a968" (UID: "2b9aa27e-76e7-4507-bca7-0ee08ff3a968"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.809429 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-kube-api-access-wwwx8" (OuterVolumeSpecName: "kube-api-access-wwwx8") pod "2b9aa27e-76e7-4507-bca7-0ee08ff3a968" (UID: "2b9aa27e-76e7-4507-bca7-0ee08ff3a968"). InnerVolumeSpecName "kube-api-access-wwwx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.817680 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a08aebb-db7e-488c-b992-2286ba6c9fd0-kube-api-access-mzsvq" (OuterVolumeSpecName: "kube-api-access-mzsvq") pod "0a08aebb-db7e-488c-b992-2286ba6c9fd0" (UID: "0a08aebb-db7e-488c-b992-2286ba6c9fd0"). InnerVolumeSpecName "kube-api-access-mzsvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.822485 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-kube-api-access-l8knm" (OuterVolumeSpecName: "kube-api-access-l8knm") pod "8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" (UID: "8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8"). InnerVolumeSpecName "kube-api-access-l8knm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.827352 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0a08aebb-db7e-488c-b992-2286ba6c9fd0" (UID: "0a08aebb-db7e-488c-b992-2286ba6c9fd0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.860599 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zhxpx" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.868304 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b9aa27e-76e7-4507-bca7-0ee08ff3a968" (UID: "2b9aa27e-76e7-4507-bca7-0ee08ff3a968"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.886123 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" (UID: "8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906613 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-utilities\") pod \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906665 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-catalog-content\") pod \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906694 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49m2p\" (UniqueName: \"kubernetes.io/projected/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-kube-api-access-49m2p\") pod \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906711 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-catalog-content\") pod \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906727 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgh9m\" (UniqueName: \"kubernetes.io/projected/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-kube-api-access-qgh9m\") pod \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\" (UID: \"6ef2ca7d-0b4b-4efa-aaad-af4137689efa\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906763 4819 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-utilities\") pod \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\" (UID: \"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a\") " Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906921 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906933 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906943 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906954 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8knm\" (UniqueName: \"kubernetes.io/projected/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-kube-api-access-l8knm\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906965 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwwx8\" (UniqueName: \"kubernetes.io/projected/2b9aa27e-76e7-4507-bca7-0ee08ff3a968-kube-api-access-wwwx8\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906975 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzsvq\" (UniqueName: \"kubernetes.io/projected/0a08aebb-db7e-488c-b992-2286ba6c9fd0-kube-api-access-mzsvq\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906984 4819 reconciler_common.go:293] "Volume 
detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.906993 4819 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a08aebb-db7e-488c-b992-2286ba6c9fd0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.907001 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.907659 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-utilities" (OuterVolumeSpecName: "utilities") pod "9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" (UID: "9fe9f0aa-6448-48d7-900d-a8d5646a1a6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.908122 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-utilities" (OuterVolumeSpecName: "utilities") pod "6ef2ca7d-0b4b-4efa-aaad-af4137689efa" (UID: "6ef2ca7d-0b4b-4efa-aaad-af4137689efa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.911544 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-kube-api-access-qgh9m" (OuterVolumeSpecName: "kube-api-access-qgh9m") pod "6ef2ca7d-0b4b-4efa-aaad-af4137689efa" (UID: "6ef2ca7d-0b4b-4efa-aaad-af4137689efa"). InnerVolumeSpecName "kube-api-access-qgh9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.911668 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-kube-api-access-49m2p" (OuterVolumeSpecName: "kube-api-access-49m2p") pod "9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" (UID: "9fe9f0aa-6448-48d7-900d-a8d5646a1a6a"). InnerVolumeSpecName "kube-api-access-49m2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:41:32 crc kubenswrapper[4819]: I0228 03:41:32.969943 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" (UID: "9fe9f0aa-6448-48d7-900d-a8d5646a1a6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.007872 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.007913 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49m2p\" (UniqueName: \"kubernetes.io/projected/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-kube-api-access-49m2p\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.007926 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.007938 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgh9m\" (UniqueName: 
\"kubernetes.io/projected/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-kube-api-access-qgh9m\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.007949 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.044514 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ef2ca7d-0b4b-4efa-aaad-af4137689efa" (UID: "6ef2ca7d-0b4b-4efa-aaad-af4137689efa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.109310 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef2ca7d-0b4b-4efa-aaad-af4137689efa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.226957 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wnnt5"] Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.312144 4819 generic.go:334] "Generic (PLEG): container finished" podID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" containerID="1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b" exitCode=0 Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.312227 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zhxpx" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.312238 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhxpx" event={"ID":"6ef2ca7d-0b4b-4efa-aaad-af4137689efa","Type":"ContainerDied","Data":"1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b"} Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.312303 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhxpx" event={"ID":"6ef2ca7d-0b4b-4efa-aaad-af4137689efa","Type":"ContainerDied","Data":"5aa9e141a13e996172a84d11aeccb114e93619f9d2b24b005805a73d0720f888"} Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.312328 4819 scope.go:117] "RemoveContainer" containerID="1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.315825 4819 generic.go:334] "Generic (PLEG): container finished" podID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" containerID="f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7" exitCode=0 Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.315920 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxgv4" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.316508 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxgv4" event={"ID":"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8","Type":"ContainerDied","Data":"f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7"} Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.316548 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxgv4" event={"ID":"8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8","Type":"ContainerDied","Data":"26be1f4c2a5789c5be9e8df543e47179ac83da44b0ec19c4bc9c74fd395c74c8"} Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.320565 4819 generic.go:334] "Generic (PLEG): container finished" podID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" containerID="3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706" exitCode=0 Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.320679 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkx4p" event={"ID":"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a","Type":"ContainerDied","Data":"3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706"} Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.320702 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bkx4p" event={"ID":"9fe9f0aa-6448-48d7-900d-a8d5646a1a6a","Type":"ContainerDied","Data":"88616beb171cc11401e3cb0f87315c8133e576797eb28800b51ba4a6cd06fd71"} Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.320787 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bkx4p" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.327806 4819 generic.go:334] "Generic (PLEG): container finished" podID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" containerID="0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61" exitCode=0 Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.327892 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hnqz" event={"ID":"2b9aa27e-76e7-4507-bca7-0ee08ff3a968","Type":"ContainerDied","Data":"0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61"} Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.327927 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hnqz" event={"ID":"2b9aa27e-76e7-4507-bca7-0ee08ff3a968","Type":"ContainerDied","Data":"015578329e0e9e690669b26b6f2f63027e9135143e69092df9aa14013c6e69fa"} Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.328038 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8hnqz" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.332594 4819 generic.go:334] "Generic (PLEG): container finished" podID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerID="1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611" exitCode=0 Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.332692 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" event={"ID":"0a08aebb-db7e-488c-b992-2286ba6c9fd0","Type":"ContainerDied","Data":"1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611"} Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.332732 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" event={"ID":"0a08aebb-db7e-488c-b992-2286ba6c9fd0","Type":"ContainerDied","Data":"aadc38e72b33d5a0a6160502cbdfa16994bdcad2504e525fc67a0f5f8227a7b3"} Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.332837 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nb49n" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.333904 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" event={"ID":"15512f8d-a53e-47cb-9b22-b8f8f410d65d","Type":"ContainerStarted","Data":"070f1d9a5dabb7452d34166d6045a77b1ebc66244a3f49cade82110efa03ab2d"} Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.339435 4819 scope.go:117] "RemoveContainer" containerID="13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.377732 4819 scope.go:117] "RemoveContainer" containerID="86b08bc5d4f872cd21c5956c8f0ce593e136e0ed25575cb38e149fd34b593f65" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.381816 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxgv4"] Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.387816 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxgv4"] Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.393326 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zhxpx"] Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.400962 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zhxpx"] Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.405928 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hnqz"] Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.406366 4819 scope.go:117] "RemoveContainer" containerID="1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.406853 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b\": container with ID starting with 1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b not found: ID does not exist" containerID="1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.406917 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b"} err="failed to get container status \"1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b\": rpc error: code = NotFound desc = could not find container \"1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b\": container with ID starting with 1b4d83044fde6268bbb3a08e20afd72923200021f29b6f305693060f3588922b not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.406954 4819 scope.go:117] "RemoveContainer" containerID="13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.407230 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2\": container with ID starting with 13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2 not found: ID does not exist" containerID="13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.407285 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2"} err="failed to get container status \"13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2\": rpc error: code = NotFound desc = could not find container \"13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2\": container 
with ID starting with 13dea61afcde8afc10538d88b477067a7fbe2e0688ac00b15f27ace89f4b54a2 not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.407310 4819 scope.go:117] "RemoveContainer" containerID="86b08bc5d4f872cd21c5956c8f0ce593e136e0ed25575cb38e149fd34b593f65" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.408992 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b08bc5d4f872cd21c5956c8f0ce593e136e0ed25575cb38e149fd34b593f65\": container with ID starting with 86b08bc5d4f872cd21c5956c8f0ce593e136e0ed25575cb38e149fd34b593f65 not found: ID does not exist" containerID="86b08bc5d4f872cd21c5956c8f0ce593e136e0ed25575cb38e149fd34b593f65" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.409025 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b08bc5d4f872cd21c5956c8f0ce593e136e0ed25575cb38e149fd34b593f65"} err="failed to get container status \"86b08bc5d4f872cd21c5956c8f0ce593e136e0ed25575cb38e149fd34b593f65\": rpc error: code = NotFound desc = could not find container \"86b08bc5d4f872cd21c5956c8f0ce593e136e0ed25575cb38e149fd34b593f65\": container with ID starting with 86b08bc5d4f872cd21c5956c8f0ce593e136e0ed25575cb38e149fd34b593f65 not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.409052 4819 scope.go:117] "RemoveContainer" containerID="f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.409762 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8hnqz"] Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.412954 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bkx4p"] Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.415584 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-bkx4p"] Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.418371 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nb49n"] Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.420545 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nb49n"] Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.446107 4819 scope.go:117] "RemoveContainer" containerID="0b203a36dfd19e346675ab907d79a0292a057e015f8b203451dc7e1d262b68ec" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.461984 4819 scope.go:117] "RemoveContainer" containerID="3fd5b29fe4b10c27148f9ca8e965828bf738c2b6dadc01e258fa40cb4c4f98d2" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.477273 4819 scope.go:117] "RemoveContainer" containerID="f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.478208 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7\": container with ID starting with f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7 not found: ID does not exist" containerID="f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.478260 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7"} err="failed to get container status \"f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7\": rpc error: code = NotFound desc = could not find container \"f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7\": container with ID starting with f0aebdab36e68d236fee5d9f3dcdd27dd63fafa0c3e597e5980a90ac84b9b7d7 not 
found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.478285 4819 scope.go:117] "RemoveContainer" containerID="0b203a36dfd19e346675ab907d79a0292a057e015f8b203451dc7e1d262b68ec" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.478695 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b203a36dfd19e346675ab907d79a0292a057e015f8b203451dc7e1d262b68ec\": container with ID starting with 0b203a36dfd19e346675ab907d79a0292a057e015f8b203451dc7e1d262b68ec not found: ID does not exist" containerID="0b203a36dfd19e346675ab907d79a0292a057e015f8b203451dc7e1d262b68ec" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.478730 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b203a36dfd19e346675ab907d79a0292a057e015f8b203451dc7e1d262b68ec"} err="failed to get container status \"0b203a36dfd19e346675ab907d79a0292a057e015f8b203451dc7e1d262b68ec\": rpc error: code = NotFound desc = could not find container \"0b203a36dfd19e346675ab907d79a0292a057e015f8b203451dc7e1d262b68ec\": container with ID starting with 0b203a36dfd19e346675ab907d79a0292a057e015f8b203451dc7e1d262b68ec not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.478756 4819 scope.go:117] "RemoveContainer" containerID="3fd5b29fe4b10c27148f9ca8e965828bf738c2b6dadc01e258fa40cb4c4f98d2" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.479067 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd5b29fe4b10c27148f9ca8e965828bf738c2b6dadc01e258fa40cb4c4f98d2\": container with ID starting with 3fd5b29fe4b10c27148f9ca8e965828bf738c2b6dadc01e258fa40cb4c4f98d2 not found: ID does not exist" containerID="3fd5b29fe4b10c27148f9ca8e965828bf738c2b6dadc01e258fa40cb4c4f98d2" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.479093 4819 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd5b29fe4b10c27148f9ca8e965828bf738c2b6dadc01e258fa40cb4c4f98d2"} err="failed to get container status \"3fd5b29fe4b10c27148f9ca8e965828bf738c2b6dadc01e258fa40cb4c4f98d2\": rpc error: code = NotFound desc = could not find container \"3fd5b29fe4b10c27148f9ca8e965828bf738c2b6dadc01e258fa40cb4c4f98d2\": container with ID starting with 3fd5b29fe4b10c27148f9ca8e965828bf738c2b6dadc01e258fa40cb4c4f98d2 not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.479110 4819 scope.go:117] "RemoveContainer" containerID="3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.492928 4819 scope.go:117] "RemoveContainer" containerID="4c5df239785118990f9b37ca1db9e176ca23348525d11c2194af41f169e43da1" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.507413 4819 scope.go:117] "RemoveContainer" containerID="2f6eda87d4dba0af09fb552755ebe8f10ed9c1015dde172b938cad4a8525c895" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.525670 4819 scope.go:117] "RemoveContainer" containerID="3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.525934 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706\": container with ID starting with 3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706 not found: ID does not exist" containerID="3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.525961 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706"} err="failed to get container status 
\"3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706\": rpc error: code = NotFound desc = could not find container \"3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706\": container with ID starting with 3076dc5de2899b1617c391652aef7b7da1e30d0719a3cef5b5fc5c092d958706 not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.525982 4819 scope.go:117] "RemoveContainer" containerID="4c5df239785118990f9b37ca1db9e176ca23348525d11c2194af41f169e43da1" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.526229 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c5df239785118990f9b37ca1db9e176ca23348525d11c2194af41f169e43da1\": container with ID starting with 4c5df239785118990f9b37ca1db9e176ca23348525d11c2194af41f169e43da1 not found: ID does not exist" containerID="4c5df239785118990f9b37ca1db9e176ca23348525d11c2194af41f169e43da1" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.526270 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c5df239785118990f9b37ca1db9e176ca23348525d11c2194af41f169e43da1"} err="failed to get container status \"4c5df239785118990f9b37ca1db9e176ca23348525d11c2194af41f169e43da1\": rpc error: code = NotFound desc = could not find container \"4c5df239785118990f9b37ca1db9e176ca23348525d11c2194af41f169e43da1\": container with ID starting with 4c5df239785118990f9b37ca1db9e176ca23348525d11c2194af41f169e43da1 not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.526284 4819 scope.go:117] "RemoveContainer" containerID="2f6eda87d4dba0af09fb552755ebe8f10ed9c1015dde172b938cad4a8525c895" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.526517 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2f6eda87d4dba0af09fb552755ebe8f10ed9c1015dde172b938cad4a8525c895\": container with ID starting with 2f6eda87d4dba0af09fb552755ebe8f10ed9c1015dde172b938cad4a8525c895 not found: ID does not exist" containerID="2f6eda87d4dba0af09fb552755ebe8f10ed9c1015dde172b938cad4a8525c895" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.526536 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f6eda87d4dba0af09fb552755ebe8f10ed9c1015dde172b938cad4a8525c895"} err="failed to get container status \"2f6eda87d4dba0af09fb552755ebe8f10ed9c1015dde172b938cad4a8525c895\": rpc error: code = NotFound desc = could not find container \"2f6eda87d4dba0af09fb552755ebe8f10ed9c1015dde172b938cad4a8525c895\": container with ID starting with 2f6eda87d4dba0af09fb552755ebe8f10ed9c1015dde172b938cad4a8525c895 not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.526548 4819 scope.go:117] "RemoveContainer" containerID="0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.540631 4819 scope.go:117] "RemoveContainer" containerID="8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.559192 4819 scope.go:117] "RemoveContainer" containerID="e6db8a5be53e66ab8050788e1bb7a99d56b1f7f522b5a78290d642087ad168eb" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.571287 4819 scope.go:117] "RemoveContainer" containerID="0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.571596 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61\": container with ID starting with 0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61 not found: ID does not exist" 
containerID="0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.571649 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61"} err="failed to get container status \"0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61\": rpc error: code = NotFound desc = could not find container \"0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61\": container with ID starting with 0212296d82385e1e4cdbe9334d668b68909bbf2ea4bf628d010ac91d0453ef61 not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.571677 4819 scope.go:117] "RemoveContainer" containerID="8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.571970 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659\": container with ID starting with 8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659 not found: ID does not exist" containerID="8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.571990 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659"} err="failed to get container status \"8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659\": rpc error: code = NotFound desc = could not find container \"8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659\": container with ID starting with 8f3fb53971f7c9c39dcb978ab7e3fd4c152bf66c1b119ea3113a5d1f332f9659 not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.572002 4819 scope.go:117] 
"RemoveContainer" containerID="e6db8a5be53e66ab8050788e1bb7a99d56b1f7f522b5a78290d642087ad168eb" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.572218 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6db8a5be53e66ab8050788e1bb7a99d56b1f7f522b5a78290d642087ad168eb\": container with ID starting with e6db8a5be53e66ab8050788e1bb7a99d56b1f7f522b5a78290d642087ad168eb not found: ID does not exist" containerID="e6db8a5be53e66ab8050788e1bb7a99d56b1f7f522b5a78290d642087ad168eb" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.572267 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6db8a5be53e66ab8050788e1bb7a99d56b1f7f522b5a78290d642087ad168eb"} err="failed to get container status \"e6db8a5be53e66ab8050788e1bb7a99d56b1f7f522b5a78290d642087ad168eb\": rpc error: code = NotFound desc = could not find container \"e6db8a5be53e66ab8050788e1bb7a99d56b1f7f522b5a78290d642087ad168eb\": container with ID starting with e6db8a5be53e66ab8050788e1bb7a99d56b1f7f522b5a78290d642087ad168eb not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.572288 4819 scope.go:117] "RemoveContainer" containerID="1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.584004 4819 scope.go:117] "RemoveContainer" containerID="332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.598991 4819 scope.go:117] "RemoveContainer" containerID="1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.599372 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611\": container with ID starting with 
1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611 not found: ID does not exist" containerID="1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.599398 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611"} err="failed to get container status \"1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611\": rpc error: code = NotFound desc = could not find container \"1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611\": container with ID starting with 1048e5053c71837d665ac84b3c6517d3956d8fbbf39b7eaf0d8595814f3dd611 not found: ID does not exist" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.599416 4819 scope.go:117] "RemoveContainer" containerID="332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8" Feb 28 03:41:33 crc kubenswrapper[4819]: E0228 03:41:33.599632 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8\": container with ID starting with 332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8 not found: ID does not exist" containerID="332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8" Feb 28 03:41:33 crc kubenswrapper[4819]: I0228 03:41:33.599652 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8"} err="failed to get container status \"332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8\": rpc error: code = NotFound desc = could not find container \"332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8\": container with ID starting with 332630b1f7af52b96eda1fabb60d03c68aee9f7f15ea480856d86e3062e988d8 not found: ID does not 
exist" Feb 28 03:41:34 crc kubenswrapper[4819]: I0228 03:41:34.349459 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" event={"ID":"15512f8d-a53e-47cb-9b22-b8f8f410d65d","Type":"ContainerStarted","Data":"962ddd67f4e2dc6060a00689975c2a8311308ff2283b777db4218af3973916aa"} Feb 28 03:41:34 crc kubenswrapper[4819]: I0228 03:41:34.349951 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:34 crc kubenswrapper[4819]: I0228 03:41:34.353429 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" Feb 28 03:41:34 crc kubenswrapper[4819]: I0228 03:41:34.376740 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wnnt5" podStartSLOduration=2.376713968 podStartE2EDuration="2.376713968s" podCreationTimestamp="2026-02-28 03:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:41:34.371406415 +0000 UTC m=+432.836975293" watchObservedRunningTime="2026-02-28 03:41:34.376713968 +0000 UTC m=+432.842282866" Feb 28 03:41:34 crc kubenswrapper[4819]: I0228 03:41:34.380207 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" path="/var/lib/kubelet/pods/0a08aebb-db7e-488c-b992-2286ba6c9fd0/volumes" Feb 28 03:41:34 crc kubenswrapper[4819]: I0228 03:41:34.381420 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" path="/var/lib/kubelet/pods/2b9aa27e-76e7-4507-bca7-0ee08ff3a968/volumes" Feb 28 03:41:34 crc kubenswrapper[4819]: I0228 03:41:34.384438 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" path="/var/lib/kubelet/pods/6ef2ca7d-0b4b-4efa-aaad-af4137689efa/volumes" Feb 28 03:41:34 crc kubenswrapper[4819]: I0228 03:41:34.390233 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" path="/var/lib/kubelet/pods/8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8/volumes" Feb 28 03:41:34 crc kubenswrapper[4819]: I0228 03:41:34.391918 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" path="/var/lib/kubelet/pods/9fe9f0aa-6448-48d7-900d-a8d5646a1a6a/volumes" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.052399 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r2rnj"] Feb 28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053420 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053442 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053460 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" containerName="extract-utilities" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053473 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" containerName="extract-utilities" Feb 28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053487 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" containerName="extract-content" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053499 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" containerName="extract-content" Feb 
28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053523 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053535 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053553 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" containerName="extract-utilities" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053565 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" containerName="extract-utilities" Feb 28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053581 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053593 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053610 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" containerName="extract-content" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053622 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" containerName="extract-content" Feb 28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053640 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerName="marketplace-operator" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053652 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerName="marketplace-operator" Feb 
28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053670 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" containerName="extract-utilities" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053685 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" containerName="extract-utilities" Feb 28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053700 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" containerName="extract-content" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053711 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" containerName="extract-content" Feb 28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053723 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" containerName="extract-content" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053735 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" containerName="extract-content" Feb 28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053748 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053759 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053772 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" containerName="extract-utilities" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053784 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" containerName="extract-utilities" Feb 28 
03:41:41 crc kubenswrapper[4819]: E0228 03:41:41.053804 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerName="marketplace-operator" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053815 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerName="marketplace-operator" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053972 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9aa27e-76e7-4507-bca7-0ee08ff3a968" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.053994 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerName="marketplace-operator" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.054010 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fe9f0aa-6448-48d7-900d-a8d5646a1a6a" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.054040 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8758ef-cad6-4c28-bc87-bea1b0c7d2b8" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.054057 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef2ca7d-0b4b-4efa-aaad-af4137689efa" containerName="registry-server" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.054635 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.108152 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r2rnj"] Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.203300 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/660d55a3-785b-4e48-8233-745276176c4a-trusted-ca\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.203347 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/660d55a3-785b-4e48-8233-745276176c4a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.203376 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/660d55a3-785b-4e48-8233-745276176c4a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.203482 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/660d55a3-785b-4e48-8233-745276176c4a-registry-tls\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.203510 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp6c8\" (UniqueName: \"kubernetes.io/projected/660d55a3-785b-4e48-8233-745276176c4a-kube-api-access-fp6c8\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.203551 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.203584 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/660d55a3-785b-4e48-8233-745276176c4a-bound-sa-token\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.203616 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/660d55a3-785b-4e48-8233-745276176c4a-registry-certificates\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.238306 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.258882 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t9qpv"] Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.259381 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a08aebb-db7e-488c-b992-2286ba6c9fd0" containerName="marketplace-operator" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.260565 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t9qpv" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.265308 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.271205 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t9qpv"] Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.305153 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/660d55a3-785b-4e48-8233-745276176c4a-registry-certificates\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.305223 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/660d55a3-785b-4e48-8233-745276176c4a-trusted-ca\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: 
\"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.305270 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/660d55a3-785b-4e48-8233-745276176c4a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.305294 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/660d55a3-785b-4e48-8233-745276176c4a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.305337 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/660d55a3-785b-4e48-8233-745276176c4a-registry-tls\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.305360 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp6c8\" (UniqueName: \"kubernetes.io/projected/660d55a3-785b-4e48-8233-745276176c4a-kube-api-access-fp6c8\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.305402 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/660d55a3-785b-4e48-8233-745276176c4a-bound-sa-token\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.306209 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/660d55a3-785b-4e48-8233-745276176c4a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.306289 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/660d55a3-785b-4e48-8233-745276176c4a-registry-certificates\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.306926 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/660d55a3-785b-4e48-8233-745276176c4a-trusted-ca\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.313219 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/660d55a3-785b-4e48-8233-745276176c4a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.314659 4819 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/660d55a3-785b-4e48-8233-745276176c4a-registry-tls\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.328035 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/660d55a3-785b-4e48-8233-745276176c4a-bound-sa-token\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.330575 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp6c8\" (UniqueName: \"kubernetes.io/projected/660d55a3-785b-4e48-8233-745276176c4a-kube-api-access-fp6c8\") pod \"image-registry-66df7c8f76-r2rnj\" (UID: \"660d55a3-785b-4e48-8233-745276176c4a\") " pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.377790 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.406282 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b971502b-767d-422a-94cf-71377b40763d-utilities\") pod \"certified-operators-t9qpv\" (UID: \"b971502b-767d-422a-94cf-71377b40763d\") " pod="openshift-marketplace/certified-operators-t9qpv" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.406347 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b971502b-767d-422a-94cf-71377b40763d-catalog-content\") pod \"certified-operators-t9qpv\" (UID: \"b971502b-767d-422a-94cf-71377b40763d\") " pod="openshift-marketplace/certified-operators-t9qpv" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.406364 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hghkq\" (UniqueName: \"kubernetes.io/projected/b971502b-767d-422a-94cf-71377b40763d-kube-api-access-hghkq\") pod \"certified-operators-t9qpv\" (UID: \"b971502b-767d-422a-94cf-71377b40763d\") " pod="openshift-marketplace/certified-operators-t9qpv" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.517522 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b971502b-767d-422a-94cf-71377b40763d-utilities\") pod \"certified-operators-t9qpv\" (UID: \"b971502b-767d-422a-94cf-71377b40763d\") " pod="openshift-marketplace/certified-operators-t9qpv" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.517904 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b971502b-767d-422a-94cf-71377b40763d-catalog-content\") pod 
\"certified-operators-t9qpv\" (UID: \"b971502b-767d-422a-94cf-71377b40763d\") " pod="openshift-marketplace/certified-operators-t9qpv" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.517945 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hghkq\" (UniqueName: \"kubernetes.io/projected/b971502b-767d-422a-94cf-71377b40763d-kube-api-access-hghkq\") pod \"certified-operators-t9qpv\" (UID: \"b971502b-767d-422a-94cf-71377b40763d\") " pod="openshift-marketplace/certified-operators-t9qpv" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.518377 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b971502b-767d-422a-94cf-71377b40763d-utilities\") pod \"certified-operators-t9qpv\" (UID: \"b971502b-767d-422a-94cf-71377b40763d\") " pod="openshift-marketplace/certified-operators-t9qpv" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.518803 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b971502b-767d-422a-94cf-71377b40763d-catalog-content\") pod \"certified-operators-t9qpv\" (UID: \"b971502b-767d-422a-94cf-71377b40763d\") " pod="openshift-marketplace/certified-operators-t9qpv" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.547727 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hghkq\" (UniqueName: \"kubernetes.io/projected/b971502b-767d-422a-94cf-71377b40763d-kube-api-access-hghkq\") pod \"certified-operators-t9qpv\" (UID: \"b971502b-767d-422a-94cf-71377b40763d\") " pod="openshift-marketplace/certified-operators-t9qpv" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.575437 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t9qpv" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.665564 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r2rnj"] Feb 28 03:41:41 crc kubenswrapper[4819]: W0228 03:41:41.681199 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660d55a3_785b_4e48_8233_745276176c4a.slice/crio-06840e9ec1265ce02161a9b7ac8cc95ce8cee21b86b715ecb78fed92b4cdf185 WatchSource:0}: Error finding container 06840e9ec1265ce02161a9b7ac8cc95ce8cee21b86b715ecb78fed92b4cdf185: Status 404 returned error can't find the container with id 06840e9ec1265ce02161a9b7ac8cc95ce8cee21b86b715ecb78fed92b4cdf185 Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.822893 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t9qpv"] Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.847892 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s552m"] Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.850524 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s552m" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.852796 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.861020 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s552m"] Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.924994 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb9d52af-88db-45b1-9d8f-2023d6116b4d-utilities\") pod \"redhat-marketplace-s552m\" (UID: \"cb9d52af-88db-45b1-9d8f-2023d6116b4d\") " pod="openshift-marketplace/redhat-marketplace-s552m" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.925095 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmqss\" (UniqueName: \"kubernetes.io/projected/cb9d52af-88db-45b1-9d8f-2023d6116b4d-kube-api-access-lmqss\") pod \"redhat-marketplace-s552m\" (UID: \"cb9d52af-88db-45b1-9d8f-2023d6116b4d\") " pod="openshift-marketplace/redhat-marketplace-s552m" Feb 28 03:41:41 crc kubenswrapper[4819]: I0228 03:41:41.925146 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb9d52af-88db-45b1-9d8f-2023d6116b4d-catalog-content\") pod \"redhat-marketplace-s552m\" (UID: \"cb9d52af-88db-45b1-9d8f-2023d6116b4d\") " pod="openshift-marketplace/redhat-marketplace-s552m" Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.026780 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb9d52af-88db-45b1-9d8f-2023d6116b4d-utilities\") pod \"redhat-marketplace-s552m\" (UID: 
\"cb9d52af-88db-45b1-9d8f-2023d6116b4d\") " pod="openshift-marketplace/redhat-marketplace-s552m" Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.026836 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmqss\" (UniqueName: \"kubernetes.io/projected/cb9d52af-88db-45b1-9d8f-2023d6116b4d-kube-api-access-lmqss\") pod \"redhat-marketplace-s552m\" (UID: \"cb9d52af-88db-45b1-9d8f-2023d6116b4d\") " pod="openshift-marketplace/redhat-marketplace-s552m" Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.026861 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb9d52af-88db-45b1-9d8f-2023d6116b4d-catalog-content\") pod \"redhat-marketplace-s552m\" (UID: \"cb9d52af-88db-45b1-9d8f-2023d6116b4d\") " pod="openshift-marketplace/redhat-marketplace-s552m" Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.027376 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb9d52af-88db-45b1-9d8f-2023d6116b4d-catalog-content\") pod \"redhat-marketplace-s552m\" (UID: \"cb9d52af-88db-45b1-9d8f-2023d6116b4d\") " pod="openshift-marketplace/redhat-marketplace-s552m" Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.027557 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb9d52af-88db-45b1-9d8f-2023d6116b4d-utilities\") pod \"redhat-marketplace-s552m\" (UID: \"cb9d52af-88db-45b1-9d8f-2023d6116b4d\") " pod="openshift-marketplace/redhat-marketplace-s552m" Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.053100 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmqss\" (UniqueName: \"kubernetes.io/projected/cb9d52af-88db-45b1-9d8f-2023d6116b4d-kube-api-access-lmqss\") pod \"redhat-marketplace-s552m\" (UID: 
\"cb9d52af-88db-45b1-9d8f-2023d6116b4d\") " pod="openshift-marketplace/redhat-marketplace-s552m" Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.174828 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s552m" Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.434022 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s552m"] Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.436919 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" event={"ID":"660d55a3-785b-4e48-8233-745276176c4a","Type":"ContainerStarted","Data":"d23137ab9dcc48e08938db41a74e2139abc464c8b8ccd3bb6d36805669f5ee6b"} Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.436958 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" event={"ID":"660d55a3-785b-4e48-8233-745276176c4a","Type":"ContainerStarted","Data":"06840e9ec1265ce02161a9b7ac8cc95ce8cee21b86b715ecb78fed92b4cdf185"} Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.437124 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.440049 4819 generic.go:334] "Generic (PLEG): container finished" podID="b971502b-767d-422a-94cf-71377b40763d" containerID="c5ea7216c03a5456c30b3067dfb66f5cbc34ceb02b6ad2470bd84daa4108893b" exitCode=0 Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.440084 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9qpv" event={"ID":"b971502b-767d-422a-94cf-71377b40763d","Type":"ContainerDied","Data":"c5ea7216c03a5456c30b3067dfb66f5cbc34ceb02b6ad2470bd84daa4108893b"} Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.440106 4819 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9qpv" event={"ID":"b971502b-767d-422a-94cf-71377b40763d","Type":"ContainerStarted","Data":"9a27934e76eccdddddf8b7f496f2e249a5e2a322c81f1528e76e0bcedd545eaf"} Feb 28 03:41:42 crc kubenswrapper[4819]: I0228 03:41:42.457497 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj" podStartSLOduration=1.457459121 podStartE2EDuration="1.457459121s" podCreationTimestamp="2026-02-28 03:41:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:41:42.45304337 +0000 UTC m=+440.918612228" watchObservedRunningTime="2026-02-28 03:41:42.457459121 +0000 UTC m=+440.923028009" Feb 28 03:41:42 crc kubenswrapper[4819]: W0228 03:41:42.461055 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb9d52af_88db_45b1_9d8f_2023d6116b4d.slice/crio-4cb375d7060b7865eef4e36263148ee95017bf129a3a09561d82907f2590d1e4 WatchSource:0}: Error finding container 4cb375d7060b7865eef4e36263148ee95017bf129a3a09561d82907f2590d1e4: Status 404 returned error can't find the container with id 4cb375d7060b7865eef4e36263148ee95017bf129a3a09561d82907f2590d1e4 Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.455345 4819 generic.go:334] "Generic (PLEG): container finished" podID="cb9d52af-88db-45b1-9d8f-2023d6116b4d" containerID="dc150d1af8b452f5c5048abd024b6786ae32096f3e40b28de32b890fc7d0a298" exitCode=0 Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.455458 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s552m" event={"ID":"cb9d52af-88db-45b1-9d8f-2023d6116b4d","Type":"ContainerDied","Data":"dc150d1af8b452f5c5048abd024b6786ae32096f3e40b28de32b890fc7d0a298"} Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.455497 4819 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s552m" event={"ID":"cb9d52af-88db-45b1-9d8f-2023d6116b4d","Type":"ContainerStarted","Data":"4cb375d7060b7865eef4e36263148ee95017bf129a3a09561d82907f2590d1e4"} Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.466081 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9qpv" event={"ID":"b971502b-767d-422a-94cf-71377b40763d","Type":"ContainerStarted","Data":"8be34e6b2d348319aff9f6fb5d641161f612592d163571e5a55247a9ec907361"} Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.656386 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fxn87"] Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.660510 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fxn87" Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.661746 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97da816f-e1cf-43dc-b0ae-d78c31a33a19-catalog-content\") pod \"community-operators-fxn87\" (UID: \"97da816f-e1cf-43dc-b0ae-d78c31a33a19\") " pod="openshift-marketplace/community-operators-fxn87" Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.661809 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97da816f-e1cf-43dc-b0ae-d78c31a33a19-utilities\") pod \"community-operators-fxn87\" (UID: \"97da816f-e1cf-43dc-b0ae-d78c31a33a19\") " pod="openshift-marketplace/community-operators-fxn87" Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.661835 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mnls\" (UniqueName: 
\"kubernetes.io/projected/97da816f-e1cf-43dc-b0ae-d78c31a33a19-kube-api-access-8mnls\") pod \"community-operators-fxn87\" (UID: \"97da816f-e1cf-43dc-b0ae-d78c31a33a19\") " pod="openshift-marketplace/community-operators-fxn87" Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.668053 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.670216 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fxn87"] Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.763623 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97da816f-e1cf-43dc-b0ae-d78c31a33a19-utilities\") pod \"community-operators-fxn87\" (UID: \"97da816f-e1cf-43dc-b0ae-d78c31a33a19\") " pod="openshift-marketplace/community-operators-fxn87" Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.765077 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97da816f-e1cf-43dc-b0ae-d78c31a33a19-utilities\") pod \"community-operators-fxn87\" (UID: \"97da816f-e1cf-43dc-b0ae-d78c31a33a19\") " pod="openshift-marketplace/community-operators-fxn87" Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.765419 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mnls\" (UniqueName: \"kubernetes.io/projected/97da816f-e1cf-43dc-b0ae-d78c31a33a19-kube-api-access-8mnls\") pod \"community-operators-fxn87\" (UID: \"97da816f-e1cf-43dc-b0ae-d78c31a33a19\") " pod="openshift-marketplace/community-operators-fxn87" Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.765723 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/97da816f-e1cf-43dc-b0ae-d78c31a33a19-catalog-content\") pod \"community-operators-fxn87\" (UID: \"97da816f-e1cf-43dc-b0ae-d78c31a33a19\") " pod="openshift-marketplace/community-operators-fxn87" Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.766391 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97da816f-e1cf-43dc-b0ae-d78c31a33a19-catalog-content\") pod \"community-operators-fxn87\" (UID: \"97da816f-e1cf-43dc-b0ae-d78c31a33a19\") " pod="openshift-marketplace/community-operators-fxn87" Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.799365 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mnls\" (UniqueName: \"kubernetes.io/projected/97da816f-e1cf-43dc-b0ae-d78c31a33a19-kube-api-access-8mnls\") pod \"community-operators-fxn87\" (UID: \"97da816f-e1cf-43dc-b0ae-d78c31a33a19\") " pod="openshift-marketplace/community-operators-fxn87" Feb 28 03:41:43 crc kubenswrapper[4819]: I0228 03:41:43.994870 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxn87" Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.286529 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fxn87"] Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.470347 4819 generic.go:334] "Generic (PLEG): container finished" podID="cb9d52af-88db-45b1-9d8f-2023d6116b4d" containerID="db5d7ae36d82d3ef49b5488a4c76961a86f8b39948ba1c443822a3670f9df0a1" exitCode=0 Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.470384 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s552m" event={"ID":"cb9d52af-88db-45b1-9d8f-2023d6116b4d","Type":"ContainerDied","Data":"db5d7ae36d82d3ef49b5488a4c76961a86f8b39948ba1c443822a3670f9df0a1"} Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.475540 4819 generic.go:334] "Generic (PLEG): container finished" podID="b971502b-767d-422a-94cf-71377b40763d" containerID="8be34e6b2d348319aff9f6fb5d641161f612592d163571e5a55247a9ec907361" exitCode=0 Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.475590 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9qpv" event={"ID":"b971502b-767d-422a-94cf-71377b40763d","Type":"ContainerDied","Data":"8be34e6b2d348319aff9f6fb5d641161f612592d163571e5a55247a9ec907361"} Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.483844 4819 generic.go:334] "Generic (PLEG): container finished" podID="97da816f-e1cf-43dc-b0ae-d78c31a33a19" containerID="eb052fb8fc48cfef70a884f3d841eb1d25fe9885be2f221fd82ac2fa6afc2cea" exitCode=0 Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.483873 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxn87" event={"ID":"97da816f-e1cf-43dc-b0ae-d78c31a33a19","Type":"ContainerDied","Data":"eb052fb8fc48cfef70a884f3d841eb1d25fe9885be2f221fd82ac2fa6afc2cea"} Feb 28 03:41:44 
crc kubenswrapper[4819]: I0228 03:41:44.483891 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxn87" event={"ID":"97da816f-e1cf-43dc-b0ae-d78c31a33a19","Type":"ContainerStarted","Data":"522302fc350b7a849b4a9f573b8af68fbed72900a15a23e86de9a98d85006dfb"} Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.655444 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9fft7"] Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.656652 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9fft7" Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.659284 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.676754 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9fft7"] Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.779472 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6ct7\" (UniqueName: \"kubernetes.io/projected/a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d-kube-api-access-t6ct7\") pod \"redhat-operators-9fft7\" (UID: \"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d\") " pod="openshift-marketplace/redhat-operators-9fft7" Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.779589 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d-catalog-content\") pod \"redhat-operators-9fft7\" (UID: \"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d\") " pod="openshift-marketplace/redhat-operators-9fft7" Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.779626 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d-utilities\") pod \"redhat-operators-9fft7\" (UID: \"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d\") " pod="openshift-marketplace/redhat-operators-9fft7" Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.880441 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6ct7\" (UniqueName: \"kubernetes.io/projected/a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d-kube-api-access-t6ct7\") pod \"redhat-operators-9fft7\" (UID: \"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d\") " pod="openshift-marketplace/redhat-operators-9fft7" Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.880586 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d-catalog-content\") pod \"redhat-operators-9fft7\" (UID: \"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d\") " pod="openshift-marketplace/redhat-operators-9fft7" Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.880634 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d-utilities\") pod \"redhat-operators-9fft7\" (UID: \"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d\") " pod="openshift-marketplace/redhat-operators-9fft7" Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.881312 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d-utilities\") pod \"redhat-operators-9fft7\" (UID: \"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d\") " pod="openshift-marketplace/redhat-operators-9fft7" Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.882214 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d-catalog-content\") pod \"redhat-operators-9fft7\" (UID: \"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d\") " pod="openshift-marketplace/redhat-operators-9fft7" Feb 28 03:41:44 crc kubenswrapper[4819]: I0228 03:41:44.902314 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6ct7\" (UniqueName: \"kubernetes.io/projected/a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d-kube-api-access-t6ct7\") pod \"redhat-operators-9fft7\" (UID: \"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d\") " pod="openshift-marketplace/redhat-operators-9fft7" Feb 28 03:41:45 crc kubenswrapper[4819]: I0228 03:41:44.998465 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9fft7" Feb 28 03:41:45 crc kubenswrapper[4819]: I0228 03:41:45.341735 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9fft7"] Feb 28 03:41:45 crc kubenswrapper[4819]: W0228 03:41:45.365332 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5d1a220_cc5c_4631_b4ef_2fc4321b1b7d.slice/crio-60b53b32a3c9862519552bb06f06c5c4bfa3f435ab0eb2a156f7b8632f291b94 WatchSource:0}: Error finding container 60b53b32a3c9862519552bb06f06c5c4bfa3f435ab0eb2a156f7b8632f291b94: Status 404 returned error can't find the container with id 60b53b32a3c9862519552bb06f06c5c4bfa3f435ab0eb2a156f7b8632f291b94 Feb 28 03:41:45 crc kubenswrapper[4819]: I0228 03:41:45.499130 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9fft7" event={"ID":"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d","Type":"ContainerStarted","Data":"60b53b32a3c9862519552bb06f06c5c4bfa3f435ab0eb2a156f7b8632f291b94"} Feb 28 03:41:45 crc kubenswrapper[4819]: I0228 03:41:45.504041 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s552m" 
event={"ID":"cb9d52af-88db-45b1-9d8f-2023d6116b4d","Type":"ContainerStarted","Data":"44f09cffd1fa97f39d3dc0d9d3a0f60a37c2c818542bd0c704571a7535e798fd"}
Feb 28 03:41:45 crc kubenswrapper[4819]: I0228 03:41:45.510824 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t9qpv" event={"ID":"b971502b-767d-422a-94cf-71377b40763d","Type":"ContainerStarted","Data":"2a7c10c54431128ae97ac83cd785832c14c662c034ce0e6d8d652321e11f6ab0"}
Feb 28 03:41:45 crc kubenswrapper[4819]: I0228 03:41:45.531060 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s552m" podStartSLOduration=3.053995926 podStartE2EDuration="4.531036831s" podCreationTimestamp="2026-02-28 03:41:41 +0000 UTC" firstStartedPulling="2026-02-28 03:41:43.457067646 +0000 UTC m=+441.922636534" lastFinishedPulling="2026-02-28 03:41:44.934108541 +0000 UTC m=+443.399677439" observedRunningTime="2026-02-28 03:41:45.530621581 +0000 UTC m=+443.996190459" watchObservedRunningTime="2026-02-28 03:41:45.531036831 +0000 UTC m=+443.996605699"
Feb 28 03:41:46 crc kubenswrapper[4819]: I0228 03:41:46.518201 4819 generic.go:334] "Generic (PLEG): container finished" podID="a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d" containerID="f8abfd2d2612568a5d4b4b545b6d3f95c0b9bb81f8758f249e6c3d6014f83e1c" exitCode=0
Feb 28 03:41:46 crc kubenswrapper[4819]: I0228 03:41:46.518303 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9fft7" event={"ID":"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d","Type":"ContainerDied","Data":"f8abfd2d2612568a5d4b4b545b6d3f95c0b9bb81f8758f249e6c3d6014f83e1c"}
Feb 28 03:41:46 crc kubenswrapper[4819]: I0228 03:41:46.521512 4819 generic.go:334] "Generic (PLEG): container finished" podID="97da816f-e1cf-43dc-b0ae-d78c31a33a19" containerID="b8a0f4a360ca0d8a3b1267cc09defd62a25a197b8b3fef3512d863850a7cdf18" exitCode=0
Feb 28 03:41:46 crc kubenswrapper[4819]: I0228 03:41:46.522662 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxn87" event={"ID":"97da816f-e1cf-43dc-b0ae-d78c31a33a19","Type":"ContainerDied","Data":"b8a0f4a360ca0d8a3b1267cc09defd62a25a197b8b3fef3512d863850a7cdf18"}
Feb 28 03:41:46 crc kubenswrapper[4819]: I0228 03:41:46.542599 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t9qpv" podStartSLOduration=2.884871252 podStartE2EDuration="5.542585246s" podCreationTimestamp="2026-02-28 03:41:41 +0000 UTC" firstStartedPulling="2026-02-28 03:41:42.452503847 +0000 UTC m=+440.918072715" lastFinishedPulling="2026-02-28 03:41:45.110217851 +0000 UTC m=+443.575786709" observedRunningTime="2026-02-28 03:41:45.547752401 +0000 UTC m=+444.013321259" watchObservedRunningTime="2026-02-28 03:41:46.542585246 +0000 UTC m=+445.008154104"
Feb 28 03:41:47 crc kubenswrapper[4819]: I0228 03:41:47.544379 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxn87" event={"ID":"97da816f-e1cf-43dc-b0ae-d78c31a33a19","Type":"ContainerStarted","Data":"7ac3871021b4f2fbbb676c6888d412cd5dd23f8b531d1f18be5e5695f624fce2"}
Feb 28 03:41:47 crc kubenswrapper[4819]: I0228 03:41:47.547320 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9fft7" event={"ID":"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d","Type":"ContainerStarted","Data":"9600c81d575f59e0c6f6314dbb9ae4f0d0d84f3efe0314b92371bef844586cc1"}
Feb 28 03:41:47 crc kubenswrapper[4819]: I0228 03:41:47.566069 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fxn87" podStartSLOduration=2.155039506 podStartE2EDuration="4.566050679s" podCreationTimestamp="2026-02-28 03:41:43 +0000 UTC" firstStartedPulling="2026-02-28 03:41:44.484809527 +0000 UTC m=+442.950378385" lastFinishedPulling="2026-02-28 03:41:46.89582069 +0000 UTC m=+445.361389558" observedRunningTime="2026-02-28 03:41:47.563828853 +0000 UTC m=+446.029397711" watchObservedRunningTime="2026-02-28 03:41:47.566050679 +0000 UTC m=+446.031619537"
Feb 28 03:41:48 crc kubenswrapper[4819]: I0228 03:41:48.555291 4819 generic.go:334] "Generic (PLEG): container finished" podID="a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d" containerID="9600c81d575f59e0c6f6314dbb9ae4f0d0d84f3efe0314b92371bef844586cc1" exitCode=0
Feb 28 03:41:48 crc kubenswrapper[4819]: I0228 03:41:48.555423 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9fft7" event={"ID":"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d","Type":"ContainerDied","Data":"9600c81d575f59e0c6f6314dbb9ae4f0d0d84f3efe0314b92371bef844586cc1"}
Feb 28 03:41:49 crc kubenswrapper[4819]: I0228 03:41:49.564837 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9fft7" event={"ID":"a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d","Type":"ContainerStarted","Data":"3cc19e368304fa4968b08b54d45031167ab6b0656d4622be84eb0316b4373c0c"}
Feb 28 03:41:49 crc kubenswrapper[4819]: I0228 03:41:49.597525 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9fft7" podStartSLOduration=3.004650251 podStartE2EDuration="5.597467306s" podCreationTimestamp="2026-02-28 03:41:44 +0000 UTC" firstStartedPulling="2026-02-28 03:41:46.519762813 +0000 UTC m=+444.985331681" lastFinishedPulling="2026-02-28 03:41:49.112579868 +0000 UTC m=+447.578148736" observedRunningTime="2026-02-28 03:41:49.589498956 +0000 UTC m=+448.055067894" watchObservedRunningTime="2026-02-28 03:41:49.597467306 +0000 UTC m=+448.063036204"
Feb 28 03:41:51 crc kubenswrapper[4819]: I0228 03:41:51.576019 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t9qpv"
Feb 28 03:41:51 crc kubenswrapper[4819]: I0228 03:41:51.576415 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t9qpv"
Feb 28 03:41:51 crc kubenswrapper[4819]: I0228 03:41:51.613829 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t9qpv"
Feb 28 03:41:52 crc kubenswrapper[4819]: I0228 03:41:52.175539 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s552m"
Feb 28 03:41:52 crc kubenswrapper[4819]: I0228 03:41:52.175692 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s552m"
Feb 28 03:41:52 crc kubenswrapper[4819]: I0228 03:41:52.246226 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s552m"
Feb 28 03:41:52 crc kubenswrapper[4819]: I0228 03:41:52.638809 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s552m"
Feb 28 03:41:52 crc kubenswrapper[4819]: I0228 03:41:52.666996 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t9qpv"
Feb 28 03:41:53 crc kubenswrapper[4819]: I0228 03:41:53.996593 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fxn87"
Feb 28 03:41:53 crc kubenswrapper[4819]: I0228 03:41:53.996677 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fxn87"
Feb 28 03:41:54 crc kubenswrapper[4819]: I0228 03:41:54.064421 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fxn87"
Feb 28 03:41:54 crc kubenswrapper[4819]: I0228 03:41:54.670217 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fxn87"
Feb 28 03:41:54 crc kubenswrapper[4819]: I0228 03:41:54.999347 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9fft7"
Feb 28 03:41:55 crc kubenswrapper[4819]: I0228 03:41:54.999806 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9fft7"
Feb 28 03:41:56 crc kubenswrapper[4819]: I0228 03:41:56.063804 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9fft7" podUID="a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d" containerName="registry-server" probeResult="failure" output=<
Feb 28 03:41:56 crc kubenswrapper[4819]: timeout: failed to connect service ":50051" within 1s
Feb 28 03:41:56 crc kubenswrapper[4819]: >
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.137960 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537502-g97bp"]
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.139365 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537502-g97bp"
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.143312 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw"
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.143397 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.144566 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.156062 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537502-g97bp"]
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.235932 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dswrg\" (UniqueName: \"kubernetes.io/projected/4b6d3d52-2b23-41b9-856b-decde2205a0c-kube-api-access-dswrg\") pod \"auto-csr-approver-29537502-g97bp\" (UID: \"4b6d3d52-2b23-41b9-856b-decde2205a0c\") " pod="openshift-infra/auto-csr-approver-29537502-g97bp"
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.337655 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dswrg\" (UniqueName: \"kubernetes.io/projected/4b6d3d52-2b23-41b9-856b-decde2205a0c-kube-api-access-dswrg\") pod \"auto-csr-approver-29537502-g97bp\" (UID: \"4b6d3d52-2b23-41b9-856b-decde2205a0c\") " pod="openshift-infra/auto-csr-approver-29537502-g97bp"
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.360261 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dswrg\" (UniqueName: \"kubernetes.io/projected/4b6d3d52-2b23-41b9-856b-decde2205a0c-kube-api-access-dswrg\") pod \"auto-csr-approver-29537502-g97bp\" (UID: \"4b6d3d52-2b23-41b9-856b-decde2205a0c\") " pod="openshift-infra/auto-csr-approver-29537502-g97bp"
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.482505 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537502-g97bp"
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.731035 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537502-g97bp"]
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.834615 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 03:42:00 crc kubenswrapper[4819]: I0228 03:42:00.834680 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 03:42:01 crc kubenswrapper[4819]: I0228 03:42:01.384476 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-r2rnj"
Feb 28 03:42:01 crc kubenswrapper[4819]: I0228 03:42:01.486626 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5w8xg"]
Feb 28 03:42:01 crc kubenswrapper[4819]: I0228 03:42:01.676698 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537502-g97bp" event={"ID":"4b6d3d52-2b23-41b9-856b-decde2205a0c","Type":"ContainerStarted","Data":"babb97f958a7df9c485f484a9b637768d380697cfd5ce2028d485b71c4a905e7"}
Feb 28 03:42:02 crc kubenswrapper[4819]: I0228 03:42:02.682788 4819 generic.go:334] "Generic (PLEG): container finished" podID="4b6d3d52-2b23-41b9-856b-decde2205a0c" containerID="f7f3fb79f5c75f1d3347eace08b6396f6d2f6367488d29505e5a0285c6cc03ef" exitCode=0
Feb 28 03:42:02 crc kubenswrapper[4819]: I0228 03:42:02.683291 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537502-g97bp" event={"ID":"4b6d3d52-2b23-41b9-856b-decde2205a0c","Type":"ContainerDied","Data":"f7f3fb79f5c75f1d3347eace08b6396f6d2f6367488d29505e5a0285c6cc03ef"}
Feb 28 03:42:03 crc kubenswrapper[4819]: I0228 03:42:03.899421 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537502-g97bp"
Feb 28 03:42:04 crc kubenswrapper[4819]: I0228 03:42:04.089181 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dswrg\" (UniqueName: \"kubernetes.io/projected/4b6d3d52-2b23-41b9-856b-decde2205a0c-kube-api-access-dswrg\") pod \"4b6d3d52-2b23-41b9-856b-decde2205a0c\" (UID: \"4b6d3d52-2b23-41b9-856b-decde2205a0c\") "
Feb 28 03:42:04 crc kubenswrapper[4819]: I0228 03:42:04.096056 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b6d3d52-2b23-41b9-856b-decde2205a0c-kube-api-access-dswrg" (OuterVolumeSpecName: "kube-api-access-dswrg") pod "4b6d3d52-2b23-41b9-856b-decde2205a0c" (UID: "4b6d3d52-2b23-41b9-856b-decde2205a0c"). InnerVolumeSpecName "kube-api-access-dswrg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:42:04 crc kubenswrapper[4819]: I0228 03:42:04.190679 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dswrg\" (UniqueName: \"kubernetes.io/projected/4b6d3d52-2b23-41b9-856b-decde2205a0c-kube-api-access-dswrg\") on node \"crc\" DevicePath \"\""
Feb 28 03:42:04 crc kubenswrapper[4819]: I0228 03:42:04.696552 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537502-g97bp" event={"ID":"4b6d3d52-2b23-41b9-856b-decde2205a0c","Type":"ContainerDied","Data":"babb97f958a7df9c485f484a9b637768d380697cfd5ce2028d485b71c4a905e7"}
Feb 28 03:42:04 crc kubenswrapper[4819]: I0228 03:42:04.697046 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="babb97f958a7df9c485f484a9b637768d380697cfd5ce2028d485b71c4a905e7"
Feb 28 03:42:04 crc kubenswrapper[4819]: I0228 03:42:04.696634 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537502-g97bp"
Feb 28 03:42:04 crc kubenswrapper[4819]: I0228 03:42:04.963896 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537496-dqglv"]
Feb 28 03:42:04 crc kubenswrapper[4819]: I0228 03:42:04.966704 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537496-dqglv"]
Feb 28 03:42:05 crc kubenswrapper[4819]: I0228 03:42:05.055003 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9fft7"
Feb 28 03:42:05 crc kubenswrapper[4819]: I0228 03:42:05.115068 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9fft7"
Feb 28 03:42:06 crc kubenswrapper[4819]: I0228 03:42:06.380060 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ec7136-9dc9-47c2-bf41-7b798e6bfe60" path="/var/lib/kubelet/pods/23ec7136-9dc9-47c2-bf41-7b798e6bfe60/volumes"
Feb 28 03:42:26 crc kubenswrapper[4819]: I0228 03:42:26.710131 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" podUID="553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" containerName="registry" containerID="cri-o://e322a2740de3b94d2c2051019d809947c93aaa50b2645aea27400ae4a384084a" gracePeriod=30
Feb 28 03:42:26 crc kubenswrapper[4819]: I0228 03:42:26.844937 4819 generic.go:334] "Generic (PLEG): container finished" podID="553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" containerID="e322a2740de3b94d2c2051019d809947c93aaa50b2645aea27400ae4a384084a" exitCode=0
Feb 28 03:42:26 crc kubenswrapper[4819]: I0228 03:42:26.845007 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" event={"ID":"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f","Type":"ContainerDied","Data":"e322a2740de3b94d2c2051019d809947c93aaa50b2645aea27400ae4a384084a"}
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.153604 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.319774 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf7n9\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-kube-api-access-vf7n9\") pod \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") "
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.319861 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-trusted-ca\") pod \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") "
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.319920 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-certificates\") pod \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") "
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.319968 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-installation-pull-secrets\") pod \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") "
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.320005 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-ca-trust-extracted\") pod \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") "
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.321419 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.321497 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") "
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.321636 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-bound-sa-token\") pod \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") "
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.321730 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-tls\") pod \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\" (UID: \"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f\") "
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.322151 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.322191 4819 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.329405 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.330725 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.332040 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.339909 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.342859 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-kube-api-access-vf7n9" (OuterVolumeSpecName: "kube-api-access-vf7n9") pod "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f"). InnerVolumeSpecName "kube-api-access-vf7n9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.355455 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" (UID: "553c6c36-5977-4ecd-a7b4-9ffb9a095d3f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.423873 4819 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.423931 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf7n9\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-kube-api-access-vf7n9\") on node \"crc\" DevicePath \"\""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.423955 4819 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.423974 4819 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.423992 4819 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.424009 4819 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.853865 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg" event={"ID":"553c6c36-5977-4ecd-a7b4-9ffb9a095d3f","Type":"ContainerDied","Data":"432a0549be230e82b3b49e5950ead1df74092d8829d489dff3fbaeba732afda1"}
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.853937 4819 scope.go:117] "RemoveContainer" containerID="e322a2740de3b94d2c2051019d809947c93aaa50b2645aea27400ae4a384084a"
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.853939 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5w8xg"
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.902478 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5w8xg"]
Feb 28 03:42:27 crc kubenswrapper[4819]: I0228 03:42:27.911578 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5w8xg"]
Feb 28 03:42:28 crc kubenswrapper[4819]: I0228 03:42:28.384678 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" path="/var/lib/kubelet/pods/553c6c36-5977-4ecd-a7b4-9ffb9a095d3f/volumes"
Feb 28 03:42:30 crc kubenswrapper[4819]: I0228 03:42:30.834584 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 03:42:30 crc kubenswrapper[4819]: I0228 03:42:30.835051 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 03:43:00 crc kubenswrapper[4819]: I0228 03:43:00.834147 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 03:43:00 crc kubenswrapper[4819]: I0228 03:43:00.834986 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 03:43:00 crc kubenswrapper[4819]: I0228 03:43:00.835057 4819 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn"
Feb 28 03:43:00 crc kubenswrapper[4819]: I0228 03:43:00.836116 4819 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e1444d8a76cee2b7dbf599d5d429088251da1e9a8f9ace55b15b8ab10db4eaf"} pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 28 03:43:00 crc kubenswrapper[4819]: I0228 03:43:00.836225 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" containerID="cri-o://2e1444d8a76cee2b7dbf599d5d429088251da1e9a8f9ace55b15b8ab10db4eaf" gracePeriod=600
Feb 28 03:43:01 crc kubenswrapper[4819]: I0228 03:43:01.098960 4819 generic.go:334] "Generic (PLEG): container finished" podID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerID="2e1444d8a76cee2b7dbf599d5d429088251da1e9a8f9ace55b15b8ab10db4eaf" exitCode=0
Feb 28 03:43:01 crc kubenswrapper[4819]: I0228 03:43:01.099019 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerDied","Data":"2e1444d8a76cee2b7dbf599d5d429088251da1e9a8f9ace55b15b8ab10db4eaf"}
Feb 28 03:43:01 crc kubenswrapper[4819]: I0228 03:43:01.100306 4819 scope.go:117] "RemoveContainer" containerID="edd76fb9a1ea893921f5867c01a4e612bc5be774d269ab9fd73a80f49a5762af"
Feb 28 03:43:02 crc kubenswrapper[4819]: I0228 03:43:02.109219 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerStarted","Data":"5ade958a408b150f7a5061d92ed6be3f2480394555f495dbd9814681a29f7247"}
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.151925 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537504-dvppb"]
Feb 28 03:44:00 crc kubenswrapper[4819]: E0228 03:44:00.152709 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" containerName="registry"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.152725 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" containerName="registry"
Feb 28 03:44:00 crc kubenswrapper[4819]: E0228 03:44:00.152750 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6d3d52-2b23-41b9-856b-decde2205a0c" containerName="oc"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.152758 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6d3d52-2b23-41b9-856b-decde2205a0c" containerName="oc"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.152871 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b6d3d52-2b23-41b9-856b-decde2205a0c" containerName="oc"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.152886 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="553c6c36-5977-4ecd-a7b4-9ffb9a095d3f" containerName="registry"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.153306 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537504-dvppb"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.156687 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.156763 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.159369 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.161319 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537504-dvppb"]
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.263446 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sknhd\" (UniqueName: \"kubernetes.io/projected/4e77f7fb-8d87-402f-8a5d-28e75ff27418-kube-api-access-sknhd\") pod \"auto-csr-approver-29537504-dvppb\" (UID: \"4e77f7fb-8d87-402f-8a5d-28e75ff27418\") " pod="openshift-infra/auto-csr-approver-29537504-dvppb"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.364962 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sknhd\" (UniqueName: \"kubernetes.io/projected/4e77f7fb-8d87-402f-8a5d-28e75ff27418-kube-api-access-sknhd\") pod \"auto-csr-approver-29537504-dvppb\" (UID: \"4e77f7fb-8d87-402f-8a5d-28e75ff27418\") " pod="openshift-infra/auto-csr-approver-29537504-dvppb"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.394685 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sknhd\" (UniqueName: \"kubernetes.io/projected/4e77f7fb-8d87-402f-8a5d-28e75ff27418-kube-api-access-sknhd\") pod \"auto-csr-approver-29537504-dvppb\" (UID: \"4e77f7fb-8d87-402f-8a5d-28e75ff27418\") " pod="openshift-infra/auto-csr-approver-29537504-dvppb"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.479887 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537504-dvppb"
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.720517 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537504-dvppb"]
Feb 28 03:44:00 crc kubenswrapper[4819]: I0228 03:44:00.723720 4819 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 28 03:44:01 crc kubenswrapper[4819]: I0228 03:44:01.505305 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537504-dvppb" event={"ID":"4e77f7fb-8d87-402f-8a5d-28e75ff27418","Type":"ContainerStarted","Data":"cf67bd4e2c71ab875525d74edf197d3b4d79774b45b9c128cb8ea263dca5e348"}
Feb 28 03:44:02 crc kubenswrapper[4819]: I0228 03:44:02.513348 4819 generic.go:334] "Generic (PLEG): container finished" podID="4e77f7fb-8d87-402f-8a5d-28e75ff27418" containerID="698393bb4ab5d3e2f2ba104e16efc862fdd69434a764aa0a71e40dbdec4c5f8d" exitCode=0
Feb 28 03:44:02 crc kubenswrapper[4819]: I0228 03:44:02.513712 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537504-dvppb" event={"ID":"4e77f7fb-8d87-402f-8a5d-28e75ff27418","Type":"ContainerDied","Data":"698393bb4ab5d3e2f2ba104e16efc862fdd69434a764aa0a71e40dbdec4c5f8d"}
Feb 28 03:44:03 crc kubenswrapper[4819]: I0228 03:44:03.796939 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537504-dvppb"
Feb 28 03:44:03 crc kubenswrapper[4819]: I0228 03:44:03.914267 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sknhd\" (UniqueName: \"kubernetes.io/projected/4e77f7fb-8d87-402f-8a5d-28e75ff27418-kube-api-access-sknhd\") pod \"4e77f7fb-8d87-402f-8a5d-28e75ff27418\" (UID: \"4e77f7fb-8d87-402f-8a5d-28e75ff27418\") "
Feb 28 03:44:03 crc kubenswrapper[4819]: I0228 03:44:03.921889 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e77f7fb-8d87-402f-8a5d-28e75ff27418-kube-api-access-sknhd" (OuterVolumeSpecName: "kube-api-access-sknhd") pod "4e77f7fb-8d87-402f-8a5d-28e75ff27418" (UID: "4e77f7fb-8d87-402f-8a5d-28e75ff27418"). InnerVolumeSpecName "kube-api-access-sknhd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:44:04 crc kubenswrapper[4819]: I0228 03:44:04.016695 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sknhd\" (UniqueName: \"kubernetes.io/projected/4e77f7fb-8d87-402f-8a5d-28e75ff27418-kube-api-access-sknhd\") on node \"crc\" DevicePath \"\""
Feb 28 03:44:04 crc kubenswrapper[4819]: I0228 03:44:04.526776 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537504-dvppb" event={"ID":"4e77f7fb-8d87-402f-8a5d-28e75ff27418","Type":"ContainerDied","Data":"cf67bd4e2c71ab875525d74edf197d3b4d79774b45b9c128cb8ea263dca5e348"}
Feb 28 03:44:04 crc kubenswrapper[4819]: I0228 03:44:04.526827 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf67bd4e2c71ab875525d74edf197d3b4d79774b45b9c128cb8ea263dca5e348"
Feb 28 03:44:04 crc kubenswrapper[4819]: I0228 03:44:04.526828 4819 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537504-dvppb" Feb 28 03:44:04 crc kubenswrapper[4819]: I0228 03:44:04.875780 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537498-5h22n"] Feb 28 03:44:04 crc kubenswrapper[4819]: I0228 03:44:04.886216 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537498-5h22n"] Feb 28 03:44:06 crc kubenswrapper[4819]: I0228 03:44:06.377387 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="500425cb-63aa-43e1-bc7b-2c11c88826c5" path="/var/lib/kubelet/pods/500425cb-63aa-43e1-bc7b-2c11c88826c5/volumes" Feb 28 03:44:45 crc kubenswrapper[4819]: I0228 03:44:45.158918 4819 scope.go:117] "RemoveContainer" containerID="333624774a7f72cc869a41ab7cf3fe9690ef9370fe3a1edb460d898b9163deab" Feb 28 03:44:45 crc kubenswrapper[4819]: I0228 03:44:45.208718 4819 scope.go:117] "RemoveContainer" containerID="376bf5ca05bb87538fadf4786ccdaa2318653d06cdb4d4265201087d9b9385fd" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.157980 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d"] Feb 28 03:45:00 crc kubenswrapper[4819]: E0228 03:45:00.158960 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e77f7fb-8d87-402f-8a5d-28e75ff27418" containerName="oc" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.158982 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e77f7fb-8d87-402f-8a5d-28e75ff27418" containerName="oc" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.159184 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e77f7fb-8d87-402f-8a5d-28e75ff27418" containerName="oc" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.159843 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.162496 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.163284 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.173744 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d"] Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.220137 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-secret-volume\") pod \"collect-profiles-29537505-dc99d\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.220200 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wlvv\" (UniqueName: \"kubernetes.io/projected/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-kube-api-access-6wlvv\") pod \"collect-profiles-29537505-dc99d\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.220397 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-config-volume\") pod \"collect-profiles-29537505-dc99d\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.322172 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-secret-volume\") pod \"collect-profiles-29537505-dc99d\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.322319 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wlvv\" (UniqueName: \"kubernetes.io/projected/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-kube-api-access-6wlvv\") pod \"collect-profiles-29537505-dc99d\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.322370 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-config-volume\") pod \"collect-profiles-29537505-dc99d\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.324436 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-config-volume\") pod \"collect-profiles-29537505-dc99d\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.331856 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-secret-volume\") pod \"collect-profiles-29537505-dc99d\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.354185 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wlvv\" (UniqueName: \"kubernetes.io/projected/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-kube-api-access-6wlvv\") pod \"collect-profiles-29537505-dc99d\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.498778 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.749390 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d"] Feb 28 03:45:00 crc kubenswrapper[4819]: W0228 03:45:00.756500 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfd471b6_339e_43ca_8fc7_06a9bee8aa46.slice/crio-16472b842e7a87b9222688a52a4f46605b82d8493bdff2f6774374e1647f8cfc WatchSource:0}: Error finding container 16472b842e7a87b9222688a52a4f46605b82d8493bdff2f6774374e1647f8cfc: Status 404 returned error can't find the container with id 16472b842e7a87b9222688a52a4f46605b82d8493bdff2f6774374e1647f8cfc Feb 28 03:45:00 crc kubenswrapper[4819]: I0228 03:45:00.915931 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" event={"ID":"cfd471b6-339e-43ca-8fc7-06a9bee8aa46","Type":"ContainerStarted","Data":"6e37695736463b97f3f106c3ad89fdae6428267c56d0508be005bfa109aa9f2b"} Feb 28 03:45:00 crc 
kubenswrapper[4819]: I0228 03:45:00.916380 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" event={"ID":"cfd471b6-339e-43ca-8fc7-06a9bee8aa46","Type":"ContainerStarted","Data":"16472b842e7a87b9222688a52a4f46605b82d8493bdff2f6774374e1647f8cfc"} Feb 28 03:45:01 crc kubenswrapper[4819]: I0228 03:45:01.922038 4819 generic.go:334] "Generic (PLEG): container finished" podID="cfd471b6-339e-43ca-8fc7-06a9bee8aa46" containerID="6e37695736463b97f3f106c3ad89fdae6428267c56d0508be005bfa109aa9f2b" exitCode=0 Feb 28 03:45:01 crc kubenswrapper[4819]: I0228 03:45:01.922077 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" event={"ID":"cfd471b6-339e-43ca-8fc7-06a9bee8aa46","Type":"ContainerDied","Data":"6e37695736463b97f3f106c3ad89fdae6428267c56d0508be005bfa109aa9f2b"} Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.285584 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.361282 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wlvv\" (UniqueName: \"kubernetes.io/projected/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-kube-api-access-6wlvv\") pod \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.361347 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-config-volume\") pod \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.361424 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-secret-volume\") pod \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\" (UID: \"cfd471b6-339e-43ca-8fc7-06a9bee8aa46\") " Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.362206 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-config-volume" (OuterVolumeSpecName: "config-volume") pod "cfd471b6-339e-43ca-8fc7-06a9bee8aa46" (UID: "cfd471b6-339e-43ca-8fc7-06a9bee8aa46"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.367914 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cfd471b6-339e-43ca-8fc7-06a9bee8aa46" (UID: "cfd471b6-339e-43ca-8fc7-06a9bee8aa46"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.368201 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-kube-api-access-6wlvv" (OuterVolumeSpecName: "kube-api-access-6wlvv") pod "cfd471b6-339e-43ca-8fc7-06a9bee8aa46" (UID: "cfd471b6-339e-43ca-8fc7-06a9bee8aa46"). InnerVolumeSpecName "kube-api-access-6wlvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.463232 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wlvv\" (UniqueName: \"kubernetes.io/projected/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-kube-api-access-6wlvv\") on node \"crc\" DevicePath \"\"" Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.463358 4819 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.463388 4819 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfd471b6-339e-43ca-8fc7-06a9bee8aa46-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.935550 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" event={"ID":"cfd471b6-339e-43ca-8fc7-06a9bee8aa46","Type":"ContainerDied","Data":"16472b842e7a87b9222688a52a4f46605b82d8493bdff2f6774374e1647f8cfc"} Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.935921 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16472b842e7a87b9222688a52a4f46605b82d8493bdff2f6774374e1647f8cfc" Feb 28 03:45:03 crc kubenswrapper[4819]: I0228 03:45:03.935677 4819 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537505-dc99d" Feb 28 03:45:30 crc kubenswrapper[4819]: I0228 03:45:30.834350 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 03:45:30 crc kubenswrapper[4819]: I0228 03:45:30.834962 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.145824 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537506-m59k7"] Feb 28 03:46:00 crc kubenswrapper[4819]: E0228 03:46:00.146718 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd471b6-339e-43ca-8fc7-06a9bee8aa46" containerName="collect-profiles" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.146735 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd471b6-339e-43ca-8fc7-06a9bee8aa46" containerName="collect-profiles" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.146853 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfd471b6-339e-43ca-8fc7-06a9bee8aa46" containerName="collect-profiles" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.147321 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537506-m59k7" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.150077 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.150305 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.153338 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.155281 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537506-m59k7"] Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.342587 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rxbc\" (UniqueName: \"kubernetes.io/projected/3c27a9b1-4c0e-4860-9ce5-565c3f610d1f-kube-api-access-7rxbc\") pod \"auto-csr-approver-29537506-m59k7\" (UID: \"3c27a9b1-4c0e-4860-9ce5-565c3f610d1f\") " pod="openshift-infra/auto-csr-approver-29537506-m59k7" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.443766 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rxbc\" (UniqueName: \"kubernetes.io/projected/3c27a9b1-4c0e-4860-9ce5-565c3f610d1f-kube-api-access-7rxbc\") pod \"auto-csr-approver-29537506-m59k7\" (UID: \"3c27a9b1-4c0e-4860-9ce5-565c3f610d1f\") " pod="openshift-infra/auto-csr-approver-29537506-m59k7" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.467880 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rxbc\" (UniqueName: \"kubernetes.io/projected/3c27a9b1-4c0e-4860-9ce5-565c3f610d1f-kube-api-access-7rxbc\") pod \"auto-csr-approver-29537506-m59k7\" (UID: \"3c27a9b1-4c0e-4860-9ce5-565c3f610d1f\") " 
pod="openshift-infra/auto-csr-approver-29537506-m59k7" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.490444 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537506-m59k7" Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.731215 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537506-m59k7"] Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.833899 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 03:46:00 crc kubenswrapper[4819]: I0228 03:46:00.833946 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 03:46:01 crc kubenswrapper[4819]: I0228 03:46:01.361053 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537506-m59k7" event={"ID":"3c27a9b1-4c0e-4860-9ce5-565c3f610d1f","Type":"ContainerStarted","Data":"2339869dfc06d4a16aa71ba47d5d50f22c2a3310e9546387d2d66215641907db"} Feb 28 03:46:02 crc kubenswrapper[4819]: I0228 03:46:02.372396 4819 generic.go:334] "Generic (PLEG): container finished" podID="3c27a9b1-4c0e-4860-9ce5-565c3f610d1f" containerID="5abee5d70577849f85b29cca4af7de719ed06cdde61734d4d8f214e9eb80f00d" exitCode=0 Feb 28 03:46:02 crc kubenswrapper[4819]: I0228 03:46:02.376584 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537506-m59k7" 
event={"ID":"3c27a9b1-4c0e-4860-9ce5-565c3f610d1f","Type":"ContainerDied","Data":"5abee5d70577849f85b29cca4af7de719ed06cdde61734d4d8f214e9eb80f00d"} Feb 28 03:46:03 crc kubenswrapper[4819]: I0228 03:46:03.631681 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537506-m59k7" Feb 28 03:46:03 crc kubenswrapper[4819]: I0228 03:46:03.791525 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rxbc\" (UniqueName: \"kubernetes.io/projected/3c27a9b1-4c0e-4860-9ce5-565c3f610d1f-kube-api-access-7rxbc\") pod \"3c27a9b1-4c0e-4860-9ce5-565c3f610d1f\" (UID: \"3c27a9b1-4c0e-4860-9ce5-565c3f610d1f\") " Feb 28 03:46:03 crc kubenswrapper[4819]: I0228 03:46:03.797545 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c27a9b1-4c0e-4860-9ce5-565c3f610d1f-kube-api-access-7rxbc" (OuterVolumeSpecName: "kube-api-access-7rxbc") pod "3c27a9b1-4c0e-4860-9ce5-565c3f610d1f" (UID: "3c27a9b1-4c0e-4860-9ce5-565c3f610d1f"). InnerVolumeSpecName "kube-api-access-7rxbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:46:03 crc kubenswrapper[4819]: I0228 03:46:03.893640 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rxbc\" (UniqueName: \"kubernetes.io/projected/3c27a9b1-4c0e-4860-9ce5-565c3f610d1f-kube-api-access-7rxbc\") on node \"crc\" DevicePath \"\"" Feb 28 03:46:04 crc kubenswrapper[4819]: I0228 03:46:04.388280 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537506-m59k7" event={"ID":"3c27a9b1-4c0e-4860-9ce5-565c3f610d1f","Type":"ContainerDied","Data":"2339869dfc06d4a16aa71ba47d5d50f22c2a3310e9546387d2d66215641907db"} Feb 28 03:46:04 crc kubenswrapper[4819]: I0228 03:46:04.388351 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2339869dfc06d4a16aa71ba47d5d50f22c2a3310e9546387d2d66215641907db" Feb 28 03:46:04 crc kubenswrapper[4819]: I0228 03:46:04.388408 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537506-m59k7" Feb 28 03:46:04 crc kubenswrapper[4819]: I0228 03:46:04.704499 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537500-dnvc5"] Feb 28 03:46:04 crc kubenswrapper[4819]: I0228 03:46:04.713200 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537500-dnvc5"] Feb 28 03:46:06 crc kubenswrapper[4819]: I0228 03:46:06.382074 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf" path="/var/lib/kubelet/pods/4f4e6c5f-e674-4dfc-9590-5f7b02a1a1cf/volumes" Feb 28 03:46:30 crc kubenswrapper[4819]: I0228 03:46:30.834762 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 28 03:46:30 crc kubenswrapper[4819]: I0228 03:46:30.835442 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 03:46:30 crc kubenswrapper[4819]: I0228 03:46:30.835494 4819 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 03:46:30 crc kubenswrapper[4819]: I0228 03:46:30.836160 4819 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ade958a408b150f7a5061d92ed6be3f2480394555f495dbd9814681a29f7247"} pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 03:46:30 crc kubenswrapper[4819]: I0228 03:46:30.836231 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" containerID="cri-o://5ade958a408b150f7a5061d92ed6be3f2480394555f495dbd9814681a29f7247" gracePeriod=600 Feb 28 03:46:31 crc kubenswrapper[4819]: I0228 03:46:31.578102 4819 generic.go:334] "Generic (PLEG): container finished" podID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerID="5ade958a408b150f7a5061d92ed6be3f2480394555f495dbd9814681a29f7247" exitCode=0 Feb 28 03:46:31 crc kubenswrapper[4819]: I0228 03:46:31.578332 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" 
event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerDied","Data":"5ade958a408b150f7a5061d92ed6be3f2480394555f495dbd9814681a29f7247"} Feb 28 03:46:31 crc kubenswrapper[4819]: I0228 03:46:31.578493 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerStarted","Data":"5ab93bb2251f8a9fb9c9db9bc6189036f7bfbd545e0f1b6f246a96c7b8188206"} Feb 28 03:46:31 crc kubenswrapper[4819]: I0228 03:46:31.578520 4819 scope.go:117] "RemoveContainer" containerID="2e1444d8a76cee2b7dbf599d5d429088251da1e9a8f9ace55b15b8ab10db4eaf" Feb 28 03:46:45 crc kubenswrapper[4819]: I0228 03:46:45.298867 4819 scope.go:117] "RemoveContainer" containerID="0dc197adab1f232babb74084e1a0fec8e9c06953be31dbbc0e5e849f6234a129" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.138983 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-njv8f"] Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.142235 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovn-controller" containerID="cri-o://f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e" gracePeriod=30 Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.142376 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6" gracePeriod=30 Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.142441 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" 
containerName="sbdb" containerID="cri-o://3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde" gracePeriod=30 Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.142487 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="kube-rbac-proxy-node" containerID="cri-o://0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5" gracePeriod=30 Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.142465 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="northd" containerID="cri-o://ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864" gracePeriod=30 Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.142372 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="nbdb" containerID="cri-o://1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b" gracePeriod=30 Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.142596 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovn-acl-logging" containerID="cri-o://cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e" gracePeriod=30 Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.208563 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" containerID="cri-o://4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8" gracePeriod=30 Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 
03:47:12.461541 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/3.log" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.463933 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovn-acl-logging/0.log" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.464546 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovn-controller/0.log" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.465137 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.532528 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.532596 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-systemd\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.532628 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-openvswitch\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 
03:47:12.532669 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9lwq\" (UniqueName: \"kubernetes.io/projected/caffcb28-383d-4424-a641-7dd1f36080c8-kube-api-access-h9lwq\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.532707 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-var-lib-openvswitch\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.532738 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-systemd-units\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.532774 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-bin\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.532889 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-env-overrides\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.532918 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-kubelet\") pod 
\"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.532950 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-ovn-kubernetes\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.532983 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-script-lib\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533020 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-config\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533051 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-ovn\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533080 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-node-log\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533123 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-log-socket\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533152 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-etc-openvswitch\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533179 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-slash\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533210 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caffcb28-383d-4424-a641-7dd1f36080c8-ovn-node-metrics-cert\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533273 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-netns\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533303 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-netd\") pod \"caffcb28-383d-4424-a641-7dd1f36080c8\" (UID: \"caffcb28-383d-4424-a641-7dd1f36080c8\") " Feb 28 03:47:12 crc 
kubenswrapper[4819]: I0228 03:47:12.533555 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533699 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533858 4819 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533885 4819 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.533936 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.534363 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-log-socket" (OuterVolumeSpecName: "log-socket") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.534651 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.534708 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.534907 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-slash" (OuterVolumeSpecName: "host-slash") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.534956 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.535027 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.535068 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.535093 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.535107 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.535140 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.535176 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-node-log" (OuterVolumeSpecName: "node-log") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.535214 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.535932 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.536456 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.543059 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-548t6"] Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.543891 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="kube-rbac-proxy-node" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.543940 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="kube-rbac-proxy-node" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.543988 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544008 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.544046 4819 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c27a9b1-4c0e-4860-9ce5-565c3f610d1f" containerName="oc" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544063 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c27a9b1-4c0e-4860-9ce5-565c3f610d1f" containerName="oc" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.544101 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="kubecfg-setup" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544120 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="kubecfg-setup" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.544155 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544171 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.544190 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovn-acl-logging" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544210 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovn-acl-logging" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.544345 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544395 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.544432 4819 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="nbdb" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544451 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="nbdb" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.544498 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovn-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544516 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovn-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544498 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caffcb28-383d-4424-a641-7dd1f36080c8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.544551 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="sbdb" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544571 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="sbdb" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.544609 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544625 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.544652 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="northd" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544669 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="northd" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544672 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caffcb28-383d-4424-a641-7dd1f36080c8-kube-api-access-h9lwq" (OuterVolumeSpecName: "kube-api-access-h9lwq") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "kube-api-access-h9lwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.544717 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="kube-rbac-proxy-ovn-metrics" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.544735 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="kube-rbac-proxy-ovn-metrics" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545384 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="nbdb" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545444 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545477 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545501 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovn-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545521 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c27a9b1-4c0e-4860-9ce5-565c3f610d1f" containerName="oc" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545555 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="kube-rbac-proxy-node" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545598 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545631 4819 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovn-acl-logging" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545656 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545688 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="kube-rbac-proxy-ovn-metrics" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545728 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="northd" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.545764 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="sbdb" Feb 28 03:47:12 crc kubenswrapper[4819]: E0228 03:47:12.546281 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.546316 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.546869 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" containerName="ovnkube-controller" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.565443 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.576051 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "caffcb28-383d-4424-a641-7dd1f36080c8" (UID: "caffcb28-383d-4424-a641-7dd1f36080c8"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.635368 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-log-socket\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.635538 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-kubelet\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.635583 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-cni-netd\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.635618 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9837d557-b014-4b4f-9e21-2f087333004a-ovnkube-config\") pod 
\"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.635661 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.635698 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9837d557-b014-4b4f-9e21-2f087333004a-ovnkube-script-lib\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.635866 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfdnc\" (UniqueName: \"kubernetes.io/projected/9837d557-b014-4b4f-9e21-2f087333004a-kube-api-access-wfdnc\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.635919 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-slash\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.635939 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-run-netns\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.635963 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-run-openvswitch\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.635982 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-run-ovn\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636011 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9837d557-b014-4b4f-9e21-2f087333004a-ovn-node-metrics-cert\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636037 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-run-systemd\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636060 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/9837d557-b014-4b4f-9e21-2f087333004a-env-overrides\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636087 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-systemd-units\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636113 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-cni-bin\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636136 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-node-log\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636157 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-var-lib-openvswitch\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636180 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-run-ovn-kubernetes\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636279 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-etc-openvswitch\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636580 4819 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636637 4819 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636663 4819 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636682 4819 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636705 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9lwq\" (UniqueName: 
\"kubernetes.io/projected/caffcb28-383d-4424-a641-7dd1f36080c8-kube-api-access-h9lwq\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636724 4819 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636742 4819 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636758 4819 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636775 4819 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636793 4819 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636823 4819 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636841 4819 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caffcb28-383d-4424-a641-7dd1f36080c8-ovnkube-config\") on node \"crc\" DevicePath 
\"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636858 4819 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636875 4819 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-node-log\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636893 4819 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-log-socket\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636909 4819 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636926 4819 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/caffcb28-383d-4424-a641-7dd1f36080c8-host-slash\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.636943 4819 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caffcb28-383d-4424-a641-7dd1f36080c8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.737842 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.737971 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738055 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9837d557-b014-4b4f-9e21-2f087333004a-ovnkube-script-lib\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738118 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfdnc\" (UniqueName: \"kubernetes.io/projected/9837d557-b014-4b4f-9e21-2f087333004a-kube-api-access-wfdnc\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738220 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-slash\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738406 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-run-netns\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738523 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-run-openvswitch\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738570 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-run-ovn\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738636 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9837d557-b014-4b4f-9e21-2f087333004a-ovn-node-metrics-cert\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738695 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-run-systemd\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738739 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9837d557-b014-4b4f-9e21-2f087333004a-env-overrides\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc 
kubenswrapper[4819]: I0228 03:47:12.738792 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-systemd-units\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738848 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-cni-bin\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738894 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-node-log\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738938 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-var-lib-openvswitch\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.738984 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-run-ovn-kubernetes\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.739045 4819 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-etc-openvswitch\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.739094 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-log-socket\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.739137 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-kubelet\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.739181 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-cni-netd\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.739236 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9837d557-b014-4b4f-9e21-2f087333004a-ovnkube-config\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.739787 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/9837d557-b014-4b4f-9e21-2f087333004a-ovnkube-script-lib\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.740055 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-systemd-units\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.740552 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-slash\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.740617 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-run-netns\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.740666 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-run-openvswitch\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.740713 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-run-ovn\") pod \"ovnkube-node-548t6\" (UID: 
\"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.741220 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9837d557-b014-4b4f-9e21-2f087333004a-ovnkube-config\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.741381 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-cni-bin\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.741455 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-node-log\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.741516 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-var-lib-openvswitch\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.741572 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-kubelet\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc 
kubenswrapper[4819]: I0228 03:47:12.741585 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-log-socket\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.741615 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-cni-netd\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.741657 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-host-run-ovn-kubernetes\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.741708 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-etc-openvswitch\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.741746 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9837d557-b014-4b4f-9e21-2f087333004a-run-systemd\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.743006 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9837d557-b014-4b4f-9e21-2f087333004a-env-overrides\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.746481 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9837d557-b014-4b4f-9e21-2f087333004a-ovn-node-metrics-cert\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.763771 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfdnc\" (UniqueName: \"kubernetes.io/projected/9837d557-b014-4b4f-9e21-2f087333004a-kube-api-access-wfdnc\") pod \"ovnkube-node-548t6\" (UID: \"9837d557-b014-4b4f-9e21-2f087333004a\") " pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:12 crc kubenswrapper[4819]: I0228 03:47:12.888772 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.093106 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovnkube-controller/3.log" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.097471 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovn-acl-logging/0.log" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.103157 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njv8f_caffcb28-383d-4424-a641-7dd1f36080c8/ovn-controller/0.log" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106145 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8" exitCode=0 Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106198 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde" exitCode=0 Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106216 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b" exitCode=0 Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106234 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864" exitCode=0 Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106273 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" 
containerID="e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6" exitCode=0 Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106287 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5" exitCode=0 Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106299 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e" exitCode=143 Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106312 4819 generic.go:334] "Generic (PLEG): container finished" podID="caffcb28-383d-4424-a641-7dd1f36080c8" containerID="f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e" exitCode=143 Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106406 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106463 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106487 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106508 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" 
event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106528 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106547 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106566 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106583 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106595 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106607 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106617 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106628 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106638 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106649 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106659 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106673 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106693 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106705 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106715 4819 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106726 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106737 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106747 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106758 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106770 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106780 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106791 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"} Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106806 4819 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106822 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106835 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106846 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106856 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106867 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106879 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106890 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106901 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106911 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106921 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106935 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f" event={"ID":"caffcb28-383d-4424-a641-7dd1f36080c8","Type":"ContainerDied","Data":"e3e2406e1911f02504bc0a19e1dcf9519fbe19f9ffdff9a439c32bd9b86f3f19"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106951 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106962 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106972 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106984 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.106996 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.107007 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.107018 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.107028 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.107038 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.107050 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.107072 4819 scope.go:117] "RemoveContainer" containerID="4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.111967 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-njv8f"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.114051 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5ldpg_78f6484e-91d1-4345-baad-9f39f49a3915/kube-multus/2.log"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.121526 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5ldpg_78f6484e-91d1-4345-baad-9f39f49a3915/kube-multus/1.log"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.121637 4819 generic.go:334] "Generic (PLEG): container finished" podID="78f6484e-91d1-4345-baad-9f39f49a3915" containerID="2e7f8be7b64993d771c7dd876fa6a871ff577a0eb29ba3ede7b6b602e19a1fd5" exitCode=2
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.121821 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5ldpg" event={"ID":"78f6484e-91d1-4345-baad-9f39f49a3915","Type":"ContainerDied","Data":"2e7f8be7b64993d771c7dd876fa6a871ff577a0eb29ba3ede7b6b602e19a1fd5"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.121868 4819 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.122720 4819 scope.go:117] "RemoveContainer" containerID="2e7f8be7b64993d771c7dd876fa6a871ff577a0eb29ba3ede7b6b602e19a1fd5"
Feb 28 03:47:13 crc kubenswrapper[4819]: E0228 03:47:13.123033 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5ldpg_openshift-multus(78f6484e-91d1-4345-baad-9f39f49a3915)\"" pod="openshift-multus/multus-5ldpg" podUID="78f6484e-91d1-4345-baad-9f39f49a3915"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.125123 4819 generic.go:334] "Generic (PLEG): container finished" podID="9837d557-b014-4b4f-9e21-2f087333004a" containerID="361a8ab55c31cac4ee5c928f525a584cfaeaa9dd171fb20eccfaf58811c1c850" exitCode=0
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.125171 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" event={"ID":"9837d557-b014-4b4f-9e21-2f087333004a","Type":"ContainerDied","Data":"361a8ab55c31cac4ee5c928f525a584cfaeaa9dd171fb20eccfaf58811c1c850"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.125203 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" event={"ID":"9837d557-b014-4b4f-9e21-2f087333004a","Type":"ContainerStarted","Data":"85183ceb37d36c97964d6d833d5dd275d9744884d8880bef0541c5b49ba52b08"}
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.151939 4819 scope.go:117] "RemoveContainer" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.205178 4819 scope.go:117] "RemoveContainer" containerID="3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.226436 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-njv8f"]
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.237375 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-njv8f"]
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.240458 4819 scope.go:117] "RemoveContainer" containerID="1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.263110 4819 scope.go:117] "RemoveContainer" containerID="ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.277882 4819 scope.go:117] "RemoveContainer" containerID="e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.323534 4819 scope.go:117] "RemoveContainer" containerID="0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.346306 4819 scope.go:117] "RemoveContainer" containerID="cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.374609 4819 scope.go:117] "RemoveContainer" containerID="f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.418459 4819 scope.go:117] "RemoveContainer" containerID="52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.439365 4819 scope.go:117] "RemoveContainer" containerID="4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"
Feb 28 03:47:13 crc kubenswrapper[4819]: E0228 03:47:13.439972 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8\": container with ID starting with 4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8 not found: ID does not exist" containerID="4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.440028 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"} err="failed to get container status \"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8\": rpc error: code = NotFound desc = could not find container \"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8\": container with ID starting with 4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.440062 4819 scope.go:117] "RemoveContainer" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"
Feb 28 03:47:13 crc kubenswrapper[4819]: E0228 03:47:13.440565 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\": container with ID starting with 876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65 not found: ID does not exist" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.440643 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"} err="failed to get container status \"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\": rpc error: code = NotFound desc = could not find container \"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\": container with ID starting with 876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.440700 4819 scope.go:117] "RemoveContainer" containerID="3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"
Feb 28 03:47:13 crc kubenswrapper[4819]: E0228 03:47:13.441456 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\": container with ID starting with 3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde not found: ID does not exist" containerID="3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.441495 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"} err="failed to get container status \"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\": rpc error: code = NotFound desc = could not find container \"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\": container with ID starting with 3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.441515 4819 scope.go:117] "RemoveContainer" containerID="1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"
Feb 28 03:47:13 crc kubenswrapper[4819]: E0228 03:47:13.441914 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\": container with ID starting with 1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b not found: ID does not exist" containerID="1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.441988 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"} err="failed to get container status \"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\": rpc error: code = NotFound desc = could not find container \"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\": container with ID starting with 1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.442028 4819 scope.go:117] "RemoveContainer" containerID="ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"
Feb 28 03:47:13 crc kubenswrapper[4819]: E0228 03:47:13.442626 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\": container with ID starting with ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864 not found: ID does not exist" containerID="ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.442735 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"} err="failed to get container status \"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\": rpc error: code = NotFound desc = could not find container \"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\": container with ID starting with ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.442780 4819 scope.go:117] "RemoveContainer" containerID="e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"
Feb 28 03:47:13 crc kubenswrapper[4819]: E0228 03:47:13.443359 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\": container with ID starting with e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6 not found: ID does not exist" containerID="e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.443390 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"} err="failed to get container status \"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\": rpc error: code = NotFound desc = could not find container \"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\": container with ID starting with e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.443417 4819 scope.go:117] "RemoveContainer" containerID="0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"
Feb 28 03:47:13 crc kubenswrapper[4819]: E0228 03:47:13.443678 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\": container with ID starting with 0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5 not found: ID does not exist" containerID="0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.443724 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"} err="failed to get container status \"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\": rpc error: code = NotFound desc = could not find container \"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\": container with ID starting with 0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.443753 4819 scope.go:117] "RemoveContainer" containerID="cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"
Feb 28 03:47:13 crc kubenswrapper[4819]: E0228 03:47:13.444577 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\": container with ID starting with cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e not found: ID does not exist" containerID="cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.444606 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"} err="failed to get container status \"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\": rpc error: code = NotFound desc = could not find container \"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\": container with ID starting with cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.444623 4819 scope.go:117] "RemoveContainer" containerID="f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"
Feb 28 03:47:13 crc kubenswrapper[4819]: E0228 03:47:13.445017 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\": container with ID starting with f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e not found: ID does not exist" containerID="f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.445068 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"} err="failed to get container status \"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\": rpc error: code = NotFound desc = could not find container \"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\": container with ID starting with f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.445098 4819 scope.go:117] "RemoveContainer" containerID="52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"
Feb 28 03:47:13 crc kubenswrapper[4819]: E0228 03:47:13.445690 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\": container with ID starting with 52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc not found: ID does not exist" containerID="52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.445719 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"} err="failed to get container status \"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\": rpc error: code = NotFound desc = could not find container \"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\": container with ID starting with 52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.445736 4819 scope.go:117] "RemoveContainer" containerID="4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.446089 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"} err="failed to get container status \"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8\": rpc error: code = NotFound desc = could not find container \"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8\": container with ID starting with 4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.446106 4819 scope.go:117] "RemoveContainer" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.446459 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"} err="failed to get container status \"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\": rpc error: code = NotFound desc = could not find container \"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\": container with ID starting with 876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.446496 4819 scope.go:117] "RemoveContainer" containerID="3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.446916 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"} err="failed to get container status \"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\": rpc error: code = NotFound desc = could not find container \"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\": container with ID starting with 3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.446957 4819 scope.go:117] "RemoveContainer" containerID="1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.447593 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"} err="failed to get container status \"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\": rpc error: code = NotFound desc = could not find container \"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\": container with ID starting with 1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.447631 4819 scope.go:117] "RemoveContainer" containerID="ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.448010 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"} err="failed to get container status \"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\": rpc error: code = NotFound desc = could not find container \"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\": container with ID starting with ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.448037 4819 scope.go:117] "RemoveContainer" containerID="e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.448374 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"} err="failed to get container status \"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\": rpc error: code = NotFound desc = could not find container \"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\": container with ID starting with e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.448397 4819 scope.go:117] "RemoveContainer" containerID="0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.448896 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"} err="failed to get container status \"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\": rpc error: code = NotFound desc = could not find container \"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\": container with ID starting with 0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.448923 4819 scope.go:117] "RemoveContainer" containerID="cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.449331 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"} err="failed to get container status \"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\": rpc error: code = NotFound desc = could not find container \"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\": container with ID starting with cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.449395 4819 scope.go:117] "RemoveContainer" containerID="f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.450138 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"} err="failed to get container status \"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\": rpc error: code = NotFound desc = could not find container \"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\": container with ID starting with f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.450190 4819 scope.go:117] "RemoveContainer" containerID="52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.450619 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"} err="failed to get container status \"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\": rpc error: code = NotFound desc = could not find container \"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\": container with ID starting with 52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.450639 4819 scope.go:117] "RemoveContainer" containerID="4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.450972 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"} err="failed to get container status \"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8\": rpc error: code = NotFound desc = could not find container \"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8\": container with ID starting with 4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.451028 4819 scope.go:117] "RemoveContainer" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.451528 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"} err="failed to get container status \"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\": rpc error: code = NotFound desc = could not find container \"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\": container with ID starting with 876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.451556 4819 scope.go:117] "RemoveContainer" containerID="3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.452033 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"} err="failed to get container status \"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\": rpc error: code = NotFound desc = could not find container \"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\": container with ID starting with 3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.452086 4819 scope.go:117] "RemoveContainer" containerID="1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.452721 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"} err="failed to get container status \"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\": rpc error: code = NotFound desc = could not find container \"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\": container with ID starting with 1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.452764 4819 scope.go:117] "RemoveContainer" containerID="ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.453551 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"} err="failed to get container status \"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\": rpc error: code = NotFound desc = could not find container \"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\": container with ID starting with ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.453596 4819 scope.go:117] "RemoveContainer" containerID="e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.454357 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"} err="failed to get container status \"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\": rpc error: code = NotFound desc = could not find container \"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\": container with ID starting with e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.454419 4819 scope.go:117] "RemoveContainer" containerID="0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.454950 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"} err="failed to get container status \"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\": rpc error: code = NotFound desc = could not find container \"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\": container with ID starting with 0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.454994 4819 scope.go:117] "RemoveContainer" containerID="cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.456374 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"} err="failed to get container status \"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\": rpc error: code = NotFound desc = could not find container \"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\": container with ID starting with cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.456412 4819 scope.go:117] "RemoveContainer" containerID="f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.456952 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"} err="failed to get container status \"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\": rpc error: code = NotFound desc = could not find container \"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\": container with ID starting with f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.456979 4819 scope.go:117] "RemoveContainer" containerID="52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.457447 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"} err="failed to get container status \"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\": rpc error: code = NotFound desc = could not find container \"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\": container with ID starting with 52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.457481 4819 scope.go:117] "RemoveContainer" containerID="4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.457985 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"} err="failed to get container status \"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8\": rpc error: code = NotFound desc = could not find container \"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8\": container with ID starting with 4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.458053 4819 scope.go:117] "RemoveContainer" containerID="876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.458748 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65"} err="failed to get container status \"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\": rpc error: code = NotFound desc = could not find container \"876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65\": container with ID starting with 876db3b10999f512ee1176cefe1a96805401995c72331403ec4a2bdf27466f65 not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.458789 4819 scope.go:117] "RemoveContainer" containerID="3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.459279 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde"} err="failed to get container status \"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\": rpc error: code = NotFound desc = could not find container \"3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde\": container with ID starting with 3640fac033874fb181a66fd32decae54a482d680c7c4e1bf921986cec965dfde not found: ID does not exist"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.459335 4819 scope.go:117] "RemoveContainer" containerID="1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"
Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.459829 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b"} err="failed to get container status \"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\": rpc error: code = NotFound desc = could not find container \"1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b\": container with ID starting with 1e1f1f033799d06ddb8cff561cfb377a564375039d5e44f0f6e4a2d86d79ed0b not found: ID does not
exist" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.459860 4819 scope.go:117] "RemoveContainer" containerID="ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.460293 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864"} err="failed to get container status \"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\": rpc error: code = NotFound desc = could not find container \"ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864\": container with ID starting with ccbc0a9d3771cef129a21adbb2260eee766f7e59c7a194927426bbe514790864 not found: ID does not exist" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.460328 4819 scope.go:117] "RemoveContainer" containerID="e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.460916 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6"} err="failed to get container status \"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\": rpc error: code = NotFound desc = could not find container \"e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6\": container with ID starting with e79898693042002394956b214aa353fad921136a9ceaea917c08cb636becc0d6 not found: ID does not exist" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.460948 4819 scope.go:117] "RemoveContainer" containerID="0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.461512 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5"} err="failed to get container status 
\"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\": rpc error: code = NotFound desc = could not find container \"0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5\": container with ID starting with 0e14862141ed28caa2282eb28632e1afd11c0407d8315f47edd47122d3c989f5 not found: ID does not exist" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.461569 4819 scope.go:117] "RemoveContainer" containerID="cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.462050 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e"} err="failed to get container status \"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\": rpc error: code = NotFound desc = could not find container \"cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e\": container with ID starting with cfa5e1d70456ae58e56de1bd0d78d2f12fffe0588047e366d4f51919b271725e not found: ID does not exist" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.462079 4819 scope.go:117] "RemoveContainer" containerID="f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.462734 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e"} err="failed to get container status \"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\": rpc error: code = NotFound desc = could not find container \"f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e\": container with ID starting with f32b42310df80d319a6cf74c957892fea39daa488a60d2878cb87709f059340e not found: ID does not exist" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.462796 4819 scope.go:117] "RemoveContainer" 
containerID="52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.463520 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc"} err="failed to get container status \"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\": rpc error: code = NotFound desc = could not find container \"52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc\": container with ID starting with 52ad70b3667fc8a3c18d2a717b86e6680fd9f07056ffb3f0963e71d064e43ccc not found: ID does not exist" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.463583 4819 scope.go:117] "RemoveContainer" containerID="4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8" Feb 28 03:47:13 crc kubenswrapper[4819]: I0228 03:47:13.464201 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8"} err="failed to get container status \"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8\": rpc error: code = NotFound desc = could not find container \"4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8\": container with ID starting with 4d0a686a9bfa836ded4fb76c7ea4849904195866d8cce892b2fc7a5d4199d5e8 not found: ID does not exist" Feb 28 03:47:14 crc kubenswrapper[4819]: I0228 03:47:14.137543 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" event={"ID":"9837d557-b014-4b4f-9e21-2f087333004a","Type":"ContainerStarted","Data":"f1369ad36098e72319206e31d8312cc894bab1269a718844e5e7b137e1c108d0"} Feb 28 03:47:14 crc kubenswrapper[4819]: I0228 03:47:14.138229 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" 
event={"ID":"9837d557-b014-4b4f-9e21-2f087333004a","Type":"ContainerStarted","Data":"dc271fd1a9889eabb1bf19b75d0a1f95250c37dc74a048580964a91991440e7d"} Feb 28 03:47:14 crc kubenswrapper[4819]: I0228 03:47:14.138281 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" event={"ID":"9837d557-b014-4b4f-9e21-2f087333004a","Type":"ContainerStarted","Data":"78b391c4db152eeb6d195f508c37f05c9d69e60d80c7c3a8251549d2f12ce929"} Feb 28 03:47:14 crc kubenswrapper[4819]: I0228 03:47:14.138300 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" event={"ID":"9837d557-b014-4b4f-9e21-2f087333004a","Type":"ContainerStarted","Data":"5be3c6a84a17b2eae034ba58e5e4b662bb113dbf03a183f7180378118c8781a3"} Feb 28 03:47:14 crc kubenswrapper[4819]: I0228 03:47:14.138320 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" event={"ID":"9837d557-b014-4b4f-9e21-2f087333004a","Type":"ContainerStarted","Data":"b0b2e6d726d64c6ae10738132b8baaa3f9b72fe57473df0f39fd546330554fa7"} Feb 28 03:47:14 crc kubenswrapper[4819]: I0228 03:47:14.407685 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caffcb28-383d-4424-a641-7dd1f36080c8" path="/var/lib/kubelet/pods/caffcb28-383d-4424-a641-7dd1f36080c8/volumes" Feb 28 03:47:15 crc kubenswrapper[4819]: I0228 03:47:15.154337 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" event={"ID":"9837d557-b014-4b4f-9e21-2f087333004a","Type":"ContainerStarted","Data":"73f0f2ffbe903b0f43829634ce0a95297cff996c34dee5aa6f80264cd0195e85"} Feb 28 03:47:17 crc kubenswrapper[4819]: I0228 03:47:17.174377 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" 
event={"ID":"9837d557-b014-4b4f-9e21-2f087333004a","Type":"ContainerStarted","Data":"055ac99360a2673110acce5345a3cff5959fefeb7c2da70fb9f51f093ea3b33b"} Feb 28 03:47:19 crc kubenswrapper[4819]: I0228 03:47:19.191341 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" event={"ID":"9837d557-b014-4b4f-9e21-2f087333004a","Type":"ContainerStarted","Data":"6befeb16871a5d4139b3f2ec647d606450bacc5e7c3e33809504b32feaa35a7b"} Feb 28 03:47:19 crc kubenswrapper[4819]: I0228 03:47:19.191989 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:19 crc kubenswrapper[4819]: I0228 03:47:19.219951 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" podStartSLOduration=7.219929983 podStartE2EDuration="7.219929983s" podCreationTimestamp="2026-02-28 03:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:47:19.217796921 +0000 UTC m=+777.683365789" watchObservedRunningTime="2026-02-28 03:47:19.219929983 +0000 UTC m=+777.685498851" Feb 28 03:47:19 crc kubenswrapper[4819]: I0228 03:47:19.228570 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:20 crc kubenswrapper[4819]: I0228 03:47:20.202477 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:20 crc kubenswrapper[4819]: I0228 03:47:20.203006 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:20 crc kubenswrapper[4819]: I0228 03:47:20.281295 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:25 crc 
kubenswrapper[4819]: I0228 03:47:25.368601 4819 scope.go:117] "RemoveContainer" containerID="2e7f8be7b64993d771c7dd876fa6a871ff577a0eb29ba3ede7b6b602e19a1fd5" Feb 28 03:47:25 crc kubenswrapper[4819]: E0228 03:47:25.369437 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5ldpg_openshift-multus(78f6484e-91d1-4345-baad-9f39f49a3915)\"" pod="openshift-multus/multus-5ldpg" podUID="78f6484e-91d1-4345-baad-9f39f49a3915" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.664341 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq"] Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.665302 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.667638 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.677734 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq"] Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.758656 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kclvl\" (UniqueName: \"kubernetes.io/projected/434b71de-3f2c-4820-943d-4b3b20e82fe2-kube-api-access-kclvl\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.758919 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.759030 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.860030 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kclvl\" (UniqueName: \"kubernetes.io/projected/434b71de-3f2c-4820-943d-4b3b20e82fe2-kube-api-access-kclvl\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.860159 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.860209 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.861076 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.861144 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.896728 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kclvl\" (UniqueName: \"kubernetes.io/projected/434b71de-3f2c-4820-943d-4b3b20e82fe2-kube-api-access-kclvl\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:25 crc kubenswrapper[4819]: I0228 03:47:25.979943 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:26 crc kubenswrapper[4819]: E0228 03:47:26.020311 4819 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(8e77eab100fd496455dec055c4e3f8e0a9b3282f9e7a097c4457a55b70c00491): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 28 03:47:26 crc kubenswrapper[4819]: E0228 03:47:26.020439 4819 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(8e77eab100fd496455dec055c4e3f8e0a9b3282f9e7a097c4457a55b70c00491): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:26 crc kubenswrapper[4819]: E0228 03:47:26.020496 4819 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(8e77eab100fd496455dec055c4e3f8e0a9b3282f9e7a097c4457a55b70c00491): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:26 crc kubenswrapper[4819]: E0228 03:47:26.020600 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace(434b71de-3f2c-4820-943d-4b3b20e82fe2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace(434b71de-3f2c-4820-943d-4b3b20e82fe2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(8e77eab100fd496455dec055c4e3f8e0a9b3282f9e7a097c4457a55b70c00491): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" podUID="434b71de-3f2c-4820-943d-4b3b20e82fe2" Feb 28 03:47:26 crc kubenswrapper[4819]: I0228 03:47:26.237636 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:26 crc kubenswrapper[4819]: I0228 03:47:26.238546 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:26 crc kubenswrapper[4819]: E0228 03:47:26.276766 4819 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(12d8dc7f169af8620c0068e484e7410a1af97dd94a3ac8dcd32a25045df1c045): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 28 03:47:26 crc kubenswrapper[4819]: E0228 03:47:26.276864 4819 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(12d8dc7f169af8620c0068e484e7410a1af97dd94a3ac8dcd32a25045df1c045): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:26 crc kubenswrapper[4819]: E0228 03:47:26.276912 4819 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(12d8dc7f169af8620c0068e484e7410a1af97dd94a3ac8dcd32a25045df1c045): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:26 crc kubenswrapper[4819]: E0228 03:47:26.277002 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace(434b71de-3f2c-4820-943d-4b3b20e82fe2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace(434b71de-3f2c-4820-943d-4b3b20e82fe2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(12d8dc7f169af8620c0068e484e7410a1af97dd94a3ac8dcd32a25045df1c045): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" podUID="434b71de-3f2c-4820-943d-4b3b20e82fe2" Feb 28 03:47:38 crc kubenswrapper[4819]: I0228 03:47:38.369455 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:38 crc kubenswrapper[4819]: I0228 03:47:38.371045 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:38 crc kubenswrapper[4819]: E0228 03:47:38.418202 4819 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(600a17d613e25151bb48e135856f776fa41982fd13135ab5fac7419c15eb867c): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 28 03:47:38 crc kubenswrapper[4819]: E0228 03:47:38.418948 4819 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(600a17d613e25151bb48e135856f776fa41982fd13135ab5fac7419c15eb867c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:38 crc kubenswrapper[4819]: E0228 03:47:38.419150 4819 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(600a17d613e25151bb48e135856f776fa41982fd13135ab5fac7419c15eb867c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:38 crc kubenswrapper[4819]: E0228 03:47:38.419473 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace(434b71de-3f2c-4820-943d-4b3b20e82fe2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace(434b71de-3f2c-4820-943d-4b3b20e82fe2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_openshift-marketplace_434b71de-3f2c-4820-943d-4b3b20e82fe2_0(600a17d613e25151bb48e135856f776fa41982fd13135ab5fac7419c15eb867c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" podUID="434b71de-3f2c-4820-943d-4b3b20e82fe2" Feb 28 03:47:40 crc kubenswrapper[4819]: I0228 03:47:40.370071 4819 scope.go:117] "RemoveContainer" containerID="2e7f8be7b64993d771c7dd876fa6a871ff577a0eb29ba3ede7b6b602e19a1fd5" Feb 28 03:47:41 crc kubenswrapper[4819]: I0228 03:47:41.341082 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5ldpg_78f6484e-91d1-4345-baad-9f39f49a3915/kube-multus/2.log" Feb 28 03:47:41 crc kubenswrapper[4819]: I0228 03:47:41.342662 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5ldpg_78f6484e-91d1-4345-baad-9f39f49a3915/kube-multus/1.log" Feb 28 03:47:41 crc kubenswrapper[4819]: I0228 03:47:41.342761 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5ldpg" event={"ID":"78f6484e-91d1-4345-baad-9f39f49a3915","Type":"ContainerStarted","Data":"ae3a09e01cc2b053c22262ad55e366b8c569be10d8833e93befdbf8b572f92ea"} Feb 28 
03:47:42 crc kubenswrapper[4819]: I0228 03:47:42.929884 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-548t6" Feb 28 03:47:45 crc kubenswrapper[4819]: I0228 03:47:45.366842 4819 scope.go:117] "RemoveContainer" containerID="5e3f42cd081ea5f0acc4ec1b50f311f2fb2506d6c4ad84745af6a581baffb8a1" Feb 28 03:47:46 crc kubenswrapper[4819]: I0228 03:47:46.388689 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5ldpg_78f6484e-91d1-4345-baad-9f39f49a3915/kube-multus/2.log" Feb 28 03:47:52 crc kubenswrapper[4819]: I0228 03:47:52.368769 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:52 crc kubenswrapper[4819]: I0228 03:47:52.374619 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" Feb 28 03:47:52 crc kubenswrapper[4819]: I0228 03:47:52.854202 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq"] Feb 28 03:47:52 crc kubenswrapper[4819]: W0228 03:47:52.857810 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod434b71de_3f2c_4820_943d_4b3b20e82fe2.slice/crio-d3a86f4f8a9ef12d1ae90be5940b05ee21a98f8dc5c83d47e379b38e5353c7af WatchSource:0}: Error finding container d3a86f4f8a9ef12d1ae90be5940b05ee21a98f8dc5c83d47e379b38e5353c7af: Status 404 returned error can't find the container with id d3a86f4f8a9ef12d1ae90be5940b05ee21a98f8dc5c83d47e379b38e5353c7af Feb 28 03:47:53 crc kubenswrapper[4819]: I0228 03:47:53.437584 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" 
event={"ID":"434b71de-3f2c-4820-943d-4b3b20e82fe2","Type":"ContainerStarted","Data":"4a255d9577ed561a0b8c007fab53556d7190104c959bf46c6fae3fc75a975b1a"}
Feb 28 03:47:53 crc kubenswrapper[4819]: I0228 03:47:53.438127 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" event={"ID":"434b71de-3f2c-4820-943d-4b3b20e82fe2","Type":"ContainerStarted","Data":"d3a86f4f8a9ef12d1ae90be5940b05ee21a98f8dc5c83d47e379b38e5353c7af"}
Feb 28 03:47:54 crc kubenswrapper[4819]: I0228 03:47:54.446595 4819 generic.go:334] "Generic (PLEG): container finished" podID="434b71de-3f2c-4820-943d-4b3b20e82fe2" containerID="4a255d9577ed561a0b8c007fab53556d7190104c959bf46c6fae3fc75a975b1a" exitCode=0
Feb 28 03:47:54 crc kubenswrapper[4819]: I0228 03:47:54.446683 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" event={"ID":"434b71de-3f2c-4820-943d-4b3b20e82fe2","Type":"ContainerDied","Data":"4a255d9577ed561a0b8c007fab53556d7190104c959bf46c6fae3fc75a975b1a"}
Feb 28 03:47:56 crc kubenswrapper[4819]: I0228 03:47:56.461530 4819 generic.go:334] "Generic (PLEG): container finished" podID="434b71de-3f2c-4820-943d-4b3b20e82fe2" containerID="edecc9acf1ec32f457f738ea59a1effd819f83277d07889e77631780a8b159d3" exitCode=0
Feb 28 03:47:56 crc kubenswrapper[4819]: I0228 03:47:56.461607 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" event={"ID":"434b71de-3f2c-4820-943d-4b3b20e82fe2","Type":"ContainerDied","Data":"edecc9acf1ec32f457f738ea59a1effd819f83277d07889e77631780a8b159d3"}
Feb 28 03:47:57 crc kubenswrapper[4819]: I0228 03:47:57.469959 4819 generic.go:334] "Generic (PLEG): container finished" podID="434b71de-3f2c-4820-943d-4b3b20e82fe2" containerID="6d274e2dc9b98ff6c3702909299dd410d51808370d620346e96aa8d4138afe25" exitCode=0
Feb 28 03:47:57 crc kubenswrapper[4819]: I0228 03:47:57.470015 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" event={"ID":"434b71de-3f2c-4820-943d-4b3b20e82fe2","Type":"ContainerDied","Data":"6d274e2dc9b98ff6c3702909299dd410d51808370d620346e96aa8d4138afe25"}
Feb 28 03:47:58 crc kubenswrapper[4819]: I0228 03:47:58.774591 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq"
Feb 28 03:47:58 crc kubenswrapper[4819]: I0228 03:47:58.854002 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kclvl\" (UniqueName: \"kubernetes.io/projected/434b71de-3f2c-4820-943d-4b3b20e82fe2-kube-api-access-kclvl\") pod \"434b71de-3f2c-4820-943d-4b3b20e82fe2\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") "
Feb 28 03:47:58 crc kubenswrapper[4819]: I0228 03:47:58.854184 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-util\") pod \"434b71de-3f2c-4820-943d-4b3b20e82fe2\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") "
Feb 28 03:47:58 crc kubenswrapper[4819]: I0228 03:47:58.855593 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-bundle\") pod \"434b71de-3f2c-4820-943d-4b3b20e82fe2\" (UID: \"434b71de-3f2c-4820-943d-4b3b20e82fe2\") "
Feb 28 03:47:58 crc kubenswrapper[4819]: I0228 03:47:58.856866 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-bundle" (OuterVolumeSpecName: "bundle") pod "434b71de-3f2c-4820-943d-4b3b20e82fe2" (UID: "434b71de-3f2c-4820-943d-4b3b20e82fe2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:47:58 crc kubenswrapper[4819]: I0228 03:47:58.862126 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/434b71de-3f2c-4820-943d-4b3b20e82fe2-kube-api-access-kclvl" (OuterVolumeSpecName: "kube-api-access-kclvl") pod "434b71de-3f2c-4820-943d-4b3b20e82fe2" (UID: "434b71de-3f2c-4820-943d-4b3b20e82fe2"). InnerVolumeSpecName "kube-api-access-kclvl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:47:58 crc kubenswrapper[4819]: I0228 03:47:58.931472 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-util" (OuterVolumeSpecName: "util") pod "434b71de-3f2c-4820-943d-4b3b20e82fe2" (UID: "434b71de-3f2c-4820-943d-4b3b20e82fe2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:47:58 crc kubenswrapper[4819]: I0228 03:47:58.956962 4819 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 03:47:58 crc kubenswrapper[4819]: I0228 03:47:58.957009 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kclvl\" (UniqueName: \"kubernetes.io/projected/434b71de-3f2c-4820-943d-4b3b20e82fe2-kube-api-access-kclvl\") on node \"crc\" DevicePath \"\""
Feb 28 03:47:58 crc kubenswrapper[4819]: I0228 03:47:58.957028 4819 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/434b71de-3f2c-4820-943d-4b3b20e82fe2-util\") on node \"crc\" DevicePath \"\""
Feb 28 03:47:59 crc kubenswrapper[4819]: I0228 03:47:59.486537 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq" event={"ID":"434b71de-3f2c-4820-943d-4b3b20e82fe2","Type":"ContainerDied","Data":"d3a86f4f8a9ef12d1ae90be5940b05ee21a98f8dc5c83d47e379b38e5353c7af"}
Feb 28 03:47:59 crc kubenswrapper[4819]: I0228 03:47:59.486948 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3a86f4f8a9ef12d1ae90be5940b05ee21a98f8dc5c83d47e379b38e5353c7af"
Feb 28 03:47:59 crc kubenswrapper[4819]: I0228 03:47:59.486585 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.154147 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537508-7zh58"]
Feb 28 03:48:00 crc kubenswrapper[4819]: E0228 03:48:00.154485 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434b71de-3f2c-4820-943d-4b3b20e82fe2" containerName="pull"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.154507 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="434b71de-3f2c-4820-943d-4b3b20e82fe2" containerName="pull"
Feb 28 03:48:00 crc kubenswrapper[4819]: E0228 03:48:00.154524 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434b71de-3f2c-4820-943d-4b3b20e82fe2" containerName="util"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.154558 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="434b71de-3f2c-4820-943d-4b3b20e82fe2" containerName="util"
Feb 28 03:48:00 crc kubenswrapper[4819]: E0228 03:48:00.154586 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434b71de-3f2c-4820-943d-4b3b20e82fe2" containerName="extract"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.154598 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="434b71de-3f2c-4820-943d-4b3b20e82fe2" containerName="extract"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.154740 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="434b71de-3f2c-4820-943d-4b3b20e82fe2" containerName="extract"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.155385 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537508-7zh58"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.158163 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.159097 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.159344 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.166188 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537508-7zh58"]
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.276270 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptn5x\" (UniqueName: \"kubernetes.io/projected/68cab466-bfa3-441b-8c96-f1e3d4b3b4f9-kube-api-access-ptn5x\") pod \"auto-csr-approver-29537508-7zh58\" (UID: \"68cab466-bfa3-441b-8c96-f1e3d4b3b4f9\") " pod="openshift-infra/auto-csr-approver-29537508-7zh58"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.378096 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptn5x\" (UniqueName: \"kubernetes.io/projected/68cab466-bfa3-441b-8c96-f1e3d4b3b4f9-kube-api-access-ptn5x\") pod \"auto-csr-approver-29537508-7zh58\" (UID: \"68cab466-bfa3-441b-8c96-f1e3d4b3b4f9\") " pod="openshift-infra/auto-csr-approver-29537508-7zh58"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.397026 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptn5x\" (UniqueName: \"kubernetes.io/projected/68cab466-bfa3-441b-8c96-f1e3d4b3b4f9-kube-api-access-ptn5x\") pod \"auto-csr-approver-29537508-7zh58\" (UID: \"68cab466-bfa3-441b-8c96-f1e3d4b3b4f9\") " pod="openshift-infra/auto-csr-approver-29537508-7zh58"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.482323 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537508-7zh58"
Feb 28 03:48:00 crc kubenswrapper[4819]: I0228 03:48:00.702464 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537508-7zh58"]
Feb 28 03:48:00 crc kubenswrapper[4819]: W0228 03:48:00.709881 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68cab466_bfa3_441b_8c96_f1e3d4b3b4f9.slice/crio-77ce240942e6ce9f25c7e464585d7bafc0bc7f951f834c0a38e45b0951c79a9b WatchSource:0}: Error finding container 77ce240942e6ce9f25c7e464585d7bafc0bc7f951f834c0a38e45b0951c79a9b: Status 404 returned error can't find the container with id 77ce240942e6ce9f25c7e464585d7bafc0bc7f951f834c0a38e45b0951c79a9b
Feb 28 03:48:01 crc kubenswrapper[4819]: I0228 03:48:01.503843 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537508-7zh58" event={"ID":"68cab466-bfa3-441b-8c96-f1e3d4b3b4f9","Type":"ContainerStarted","Data":"77ce240942e6ce9f25c7e464585d7bafc0bc7f951f834c0a38e45b0951c79a9b"}
Feb 28 03:48:02 crc kubenswrapper[4819]: I0228 03:48:02.510829 4819 generic.go:334] "Generic (PLEG): container finished" podID="68cab466-bfa3-441b-8c96-f1e3d4b3b4f9" containerID="bf95251f6ec7d2a5705c95048ed01fcaa6d3401462b3877b8500199ef85a2f72" exitCode=0
Feb 28 03:48:02 crc kubenswrapper[4819]: I0228 03:48:02.511073 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537508-7zh58" event={"ID":"68cab466-bfa3-441b-8c96-f1e3d4b3b4f9","Type":"ContainerDied","Data":"bf95251f6ec7d2a5705c95048ed01fcaa6d3401462b3877b8500199ef85a2f72"}
Feb 28 03:48:03 crc kubenswrapper[4819]: I0228 03:48:03.758322 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537508-7zh58"
Feb 28 03:48:03 crc kubenswrapper[4819]: I0228 03:48:03.820533 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptn5x\" (UniqueName: \"kubernetes.io/projected/68cab466-bfa3-441b-8c96-f1e3d4b3b4f9-kube-api-access-ptn5x\") pod \"68cab466-bfa3-441b-8c96-f1e3d4b3b4f9\" (UID: \"68cab466-bfa3-441b-8c96-f1e3d4b3b4f9\") "
Feb 28 03:48:03 crc kubenswrapper[4819]: I0228 03:48:03.832726 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68cab466-bfa3-441b-8c96-f1e3d4b3b4f9-kube-api-access-ptn5x" (OuterVolumeSpecName: "kube-api-access-ptn5x") pod "68cab466-bfa3-441b-8c96-f1e3d4b3b4f9" (UID: "68cab466-bfa3-441b-8c96-f1e3d4b3b4f9"). InnerVolumeSpecName "kube-api-access-ptn5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:48:03 crc kubenswrapper[4819]: I0228 03:48:03.922755 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptn5x\" (UniqueName: \"kubernetes.io/projected/68cab466-bfa3-441b-8c96-f1e3d4b3b4f9-kube-api-access-ptn5x\") on node \"crc\" DevicePath \"\""
Feb 28 03:48:04 crc kubenswrapper[4819]: I0228 03:48:04.525230 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537508-7zh58" event={"ID":"68cab466-bfa3-441b-8c96-f1e3d4b3b4f9","Type":"ContainerDied","Data":"77ce240942e6ce9f25c7e464585d7bafc0bc7f951f834c0a38e45b0951c79a9b"}
Feb 28 03:48:04 crc kubenswrapper[4819]: I0228 03:48:04.525315 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537508-7zh58"
Feb 28 03:48:04 crc kubenswrapper[4819]: I0228 03:48:04.525340 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77ce240942e6ce9f25c7e464585d7bafc0bc7f951f834c0a38e45b0951c79a9b"
Feb 28 03:48:04 crc kubenswrapper[4819]: I0228 03:48:04.835734 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537502-g97bp"]
Feb 28 03:48:04 crc kubenswrapper[4819]: I0228 03:48:04.842553 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537502-g97bp"]
Feb 28 03:48:06 crc kubenswrapper[4819]: I0228 03:48:06.376893 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b6d3d52-2b23-41b9-856b-decde2205a0c" path="/var/lib/kubelet/pods/4b6d3d52-2b23-41b9-856b-decde2205a0c/volumes"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.071991 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"]
Feb 28 03:48:09 crc kubenswrapper[4819]: E0228 03:48:09.072205 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cab466-bfa3-441b-8c96-f1e3d4b3b4f9" containerName="oc"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.072219 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cab466-bfa3-441b-8c96-f1e3d4b3b4f9" containerName="oc"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.072361 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="68cab466-bfa3-441b-8c96-f1e3d4b3b4f9" containerName="oc"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.073014 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.074874 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-pkt9p"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.075176 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.075208 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.075211 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.075740 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.092860 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"]
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.202091 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4384095a-345c-400a-bd62-0f8ca53b1ea3-webhook-cert\") pod \"metallb-operator-controller-manager-7d5484c9c9-4vfr2\" (UID: \"4384095a-345c-400a-bd62-0f8ca53b1ea3\") " pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.202138 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4384095a-345c-400a-bd62-0f8ca53b1ea3-apiservice-cert\") pod \"metallb-operator-controller-manager-7d5484c9c9-4vfr2\" (UID: \"4384095a-345c-400a-bd62-0f8ca53b1ea3\") " pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.202179 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc26h\" (UniqueName: \"kubernetes.io/projected/4384095a-345c-400a-bd62-0f8ca53b1ea3-kube-api-access-fc26h\") pod \"metallb-operator-controller-manager-7d5484c9c9-4vfr2\" (UID: \"4384095a-345c-400a-bd62-0f8ca53b1ea3\") " pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.303779 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4384095a-345c-400a-bd62-0f8ca53b1ea3-webhook-cert\") pod \"metallb-operator-controller-manager-7d5484c9c9-4vfr2\" (UID: \"4384095a-345c-400a-bd62-0f8ca53b1ea3\") " pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.303831 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4384095a-345c-400a-bd62-0f8ca53b1ea3-apiservice-cert\") pod \"metallb-operator-controller-manager-7d5484c9c9-4vfr2\" (UID: \"4384095a-345c-400a-bd62-0f8ca53b1ea3\") " pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.303870 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc26h\" (UniqueName: \"kubernetes.io/projected/4384095a-345c-400a-bd62-0f8ca53b1ea3-kube-api-access-fc26h\") pod \"metallb-operator-controller-manager-7d5484c9c9-4vfr2\" (UID: \"4384095a-345c-400a-bd62-0f8ca53b1ea3\") " pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.309710 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4384095a-345c-400a-bd62-0f8ca53b1ea3-webhook-cert\") pod \"metallb-operator-controller-manager-7d5484c9c9-4vfr2\" (UID: \"4384095a-345c-400a-bd62-0f8ca53b1ea3\") " pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.320278 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc26h\" (UniqueName: \"kubernetes.io/projected/4384095a-345c-400a-bd62-0f8ca53b1ea3-kube-api-access-fc26h\") pod \"metallb-operator-controller-manager-7d5484c9c9-4vfr2\" (UID: \"4384095a-345c-400a-bd62-0f8ca53b1ea3\") " pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.320597 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4384095a-345c-400a-bd62-0f8ca53b1ea3-apiservice-cert\") pod \"metallb-operator-controller-manager-7d5484c9c9-4vfr2\" (UID: \"4384095a-345c-400a-bd62-0f8ca53b1ea3\") " pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.387632 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.406934 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"]
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.407554 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.410538 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.410649 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-54hzm"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.410691 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.424856 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"]
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.506559 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55049d53-0610-4ce3-843d-209930ea1421-apiservice-cert\") pod \"metallb-operator-webhook-server-644d79b54d-t68wv\" (UID: \"55049d53-0610-4ce3-843d-209930ea1421\") " pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.506611 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55049d53-0610-4ce3-843d-209930ea1421-webhook-cert\") pod \"metallb-operator-webhook-server-644d79b54d-t68wv\" (UID: \"55049d53-0610-4ce3-843d-209930ea1421\") " pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.506650 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6l7g\" (UniqueName: \"kubernetes.io/projected/55049d53-0610-4ce3-843d-209930ea1421-kube-api-access-m6l7g\") pod \"metallb-operator-webhook-server-644d79b54d-t68wv\" (UID: \"55049d53-0610-4ce3-843d-209930ea1421\") " pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.608802 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55049d53-0610-4ce3-843d-209930ea1421-apiservice-cert\") pod \"metallb-operator-webhook-server-644d79b54d-t68wv\" (UID: \"55049d53-0610-4ce3-843d-209930ea1421\") " pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.609282 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55049d53-0610-4ce3-843d-209930ea1421-webhook-cert\") pod \"metallb-operator-webhook-server-644d79b54d-t68wv\" (UID: \"55049d53-0610-4ce3-843d-209930ea1421\") " pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.609455 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6l7g\" (UniqueName: \"kubernetes.io/projected/55049d53-0610-4ce3-843d-209930ea1421-kube-api-access-m6l7g\") pod \"metallb-operator-webhook-server-644d79b54d-t68wv\" (UID: \"55049d53-0610-4ce3-843d-209930ea1421\") " pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.617991 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55049d53-0610-4ce3-843d-209930ea1421-apiservice-cert\") pod \"metallb-operator-webhook-server-644d79b54d-t68wv\" (UID: \"55049d53-0610-4ce3-843d-209930ea1421\") " pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.621787 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55049d53-0610-4ce3-843d-209930ea1421-webhook-cert\") pod \"metallb-operator-webhook-server-644d79b54d-t68wv\" (UID: \"55049d53-0610-4ce3-843d-209930ea1421\") " pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.648909 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6l7g\" (UniqueName: \"kubernetes.io/projected/55049d53-0610-4ce3-843d-209930ea1421-kube-api-access-m6l7g\") pod \"metallb-operator-webhook-server-644d79b54d-t68wv\" (UID: \"55049d53-0610-4ce3-843d-209930ea1421\") " pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.776553 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"]
Feb 28 03:48:09 crc kubenswrapper[4819]: W0228 03:48:09.784765 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4384095a_345c_400a_bd62_0f8ca53b1ea3.slice/crio-3c985dbf21f1d6ce8f7d14d9d95d34a435ee72d5c5aad66a0469998a4eb0e364 WatchSource:0}: Error finding container 3c985dbf21f1d6ce8f7d14d9d95d34a435ee72d5c5aad66a0469998a4eb0e364: Status 404 returned error can't find the container with id 3c985dbf21f1d6ce8f7d14d9d95d34a435ee72d5c5aad66a0469998a4eb0e364
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.787004 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:09 crc kubenswrapper[4819]: I0228 03:48:09.967327 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"]
Feb 28 03:48:09 crc kubenswrapper[4819]: W0228 03:48:09.975729 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55049d53_0610_4ce3_843d_209930ea1421.slice/crio-210568ffe93ff1d30d74a1aa498fdbca97a90b6583fbdf9971ace8ab63a7f337 WatchSource:0}: Error finding container 210568ffe93ff1d30d74a1aa498fdbca97a90b6583fbdf9971ace8ab63a7f337: Status 404 returned error can't find the container with id 210568ffe93ff1d30d74a1aa498fdbca97a90b6583fbdf9971ace8ab63a7f337
Feb 28 03:48:10 crc kubenswrapper[4819]: I0228 03:48:10.244847 4819 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 28 03:48:10 crc kubenswrapper[4819]: I0228 03:48:10.571801 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2" event={"ID":"4384095a-345c-400a-bd62-0f8ca53b1ea3","Type":"ContainerStarted","Data":"3c985dbf21f1d6ce8f7d14d9d95d34a435ee72d5c5aad66a0469998a4eb0e364"}
Feb 28 03:48:10 crc kubenswrapper[4819]: I0228 03:48:10.573057 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv" event={"ID":"55049d53-0610-4ce3-843d-209930ea1421","Type":"ContainerStarted","Data":"210568ffe93ff1d30d74a1aa498fdbca97a90b6583fbdf9971ace8ab63a7f337"}
Feb 28 03:48:13 crc kubenswrapper[4819]: I0228 03:48:13.597366 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2" event={"ID":"4384095a-345c-400a-bd62-0f8ca53b1ea3","Type":"ContainerStarted","Data":"4cb8ef5287db3c51ac967c1d33dd9baa959464f24bea6e3c234dad63a6df4c29"}
Feb 28 03:48:13 crc kubenswrapper[4819]: I0228 03:48:13.597772 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:15 crc kubenswrapper[4819]: I0228 03:48:15.612118 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv" event={"ID":"55049d53-0610-4ce3-843d-209930ea1421","Type":"ContainerStarted","Data":"3de2115508f586174831546b9296ef6e4009f1316287265bd582a151abcd9821"}
Feb 28 03:48:15 crc kubenswrapper[4819]: I0228 03:48:15.612873 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:15 crc kubenswrapper[4819]: I0228 03:48:15.633506 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2" podStartSLOduration=3.445957435 podStartE2EDuration="6.633484694s" podCreationTimestamp="2026-02-28 03:48:09 +0000 UTC" firstStartedPulling="2026-02-28 03:48:09.788288501 +0000 UTC m=+828.253857359" lastFinishedPulling="2026-02-28 03:48:12.97581576 +0000 UTC m=+831.441384618" observedRunningTime="2026-02-28 03:48:13.620172547 +0000 UTC m=+832.085741415" watchObservedRunningTime="2026-02-28 03:48:15.633484694 +0000 UTC m=+834.099053562"
Feb 28 03:48:15 crc kubenswrapper[4819]: I0228 03:48:15.634690 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv" podStartSLOduration=2.056899458 podStartE2EDuration="6.634675113s" podCreationTimestamp="2026-02-28 03:48:09 +0000 UTC" firstStartedPulling="2026-02-28 03:48:09.9798049 +0000 UTC m=+828.445373768" lastFinishedPulling="2026-02-28 03:48:14.557580545 +0000 UTC m=+833.023149423" observedRunningTime="2026-02-28 03:48:15.630365107 +0000 UTC m=+834.095934005" watchObservedRunningTime="2026-02-28 03:48:15.634675113 +0000 UTC m=+834.100243981"
Feb 28 03:48:29 crc kubenswrapper[4819]: I0228 03:48:29.817233 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-644d79b54d-t68wv"
Feb 28 03:48:45 crc kubenswrapper[4819]: I0228 03:48:45.462635 4819 scope.go:117] "RemoveContainer" containerID="f7f3fb79f5c75f1d3347eace08b6396f6d2f6367488d29505e5a0285c6cc03ef"
Feb 28 03:48:49 crc kubenswrapper[4819]: I0228 03:48:49.391082 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7d5484c9c9-4vfr2"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.263028 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-n8zsj"]
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.266904 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.272306 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs"]
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.273341 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.277656 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.277811 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-m6ph8"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.277839 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.278589 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.293148 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs"]
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.323905 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d65903fa-28c3-4b9d-890d-7b605a91f0d8-cert\") pod \"frr-k8s-webhook-server-7f989f654f-xwnrs\" (UID: \"d65903fa-28c3-4b9d-890d-7b605a91f0d8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.323971 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-frr-sockets\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.324008 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/24de1316-bb5e-4366-a897-7d4dc6349b1f-frr-startup\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.324098 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-metrics\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.324140 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56dhq\" (UniqueName: \"kubernetes.io/projected/24de1316-bb5e-4366-a897-7d4dc6349b1f-kube-api-access-56dhq\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.324195 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-reloader\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.324238 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-frr-conf\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.324287 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vth8v\" (UniqueName: \"kubernetes.io/projected/d65903fa-28c3-4b9d-890d-7b605a91f0d8-kube-api-access-vth8v\") pod \"frr-k8s-webhook-server-7f989f654f-xwnrs\" (UID: \"d65903fa-28c3-4b9d-890d-7b605a91f0d8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.324376 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24de1316-bb5e-4366-a897-7d4dc6349b1f-metrics-certs\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.359515 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zt2hg"]
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.360295 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zt2hg"
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.371162 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-sgqvl"]
Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.372143 4819 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-sgqvl" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.372581 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.373106 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-s6lvr" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.373335 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.373486 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.377291 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.382889 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-sgqvl"] Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.425657 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-memberlist\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.425709 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tdws\" (UniqueName: \"kubernetes.io/projected/3257ff95-9460-49e8-8ae9-9758e419abee-kube-api-access-5tdws\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.425735 4819 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24de1316-bb5e-4366-a897-7d4dc6349b1f-metrics-certs\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.425801 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqt59\" (UniqueName: \"kubernetes.io/projected/afd495b6-4c71-455d-ba10-061bbf630cc5-kube-api-access-fqt59\") pod \"controller-86ddb6bd46-sgqvl\" (UID: \"afd495b6-4c71-455d-ba10-061bbf630cc5\") " pod="metallb-system/controller-86ddb6bd46-sgqvl" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.425828 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d65903fa-28c3-4b9d-890d-7b605a91f0d8-cert\") pod \"frr-k8s-webhook-server-7f989f654f-xwnrs\" (UID: \"d65903fa-28c3-4b9d-890d-7b605a91f0d8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.425848 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afd495b6-4c71-455d-ba10-061bbf630cc5-cert\") pod \"controller-86ddb6bd46-sgqvl\" (UID: \"afd495b6-4c71-455d-ba10-061bbf630cc5\") " pod="metallb-system/controller-86ddb6bd46-sgqvl" Feb 28 03:48:50 crc kubenswrapper[4819]: E0228 03:48:50.425904 4819 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.425921 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-frr-sockets\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc 
kubenswrapper[4819]: E0228 03:48:50.425931 4819 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 28 03:48:50 crc kubenswrapper[4819]: E0228 03:48:50.425957 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24de1316-bb5e-4366-a897-7d4dc6349b1f-metrics-certs podName:24de1316-bb5e-4366-a897-7d4dc6349b1f nodeName:}" failed. No retries permitted until 2026-02-28 03:48:50.925941936 +0000 UTC m=+869.391510794 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24de1316-bb5e-4366-a897-7d4dc6349b1f-metrics-certs") pod "frr-k8s-n8zsj" (UID: "24de1316-bb5e-4366-a897-7d4dc6349b1f") : secret "frr-k8s-certs-secret" not found Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426037 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-metrics-certs\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:50 crc kubenswrapper[4819]: E0228 03:48:50.426109 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d65903fa-28c3-4b9d-890d-7b605a91f0d8-cert podName:d65903fa-28c3-4b9d-890d-7b605a91f0d8 nodeName:}" failed. No retries permitted until 2026-02-28 03:48:50.9260864 +0000 UTC m=+869.391655268 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d65903fa-28c3-4b9d-890d-7b605a91f0d8-cert") pod "frr-k8s-webhook-server-7f989f654f-xwnrs" (UID: "d65903fa-28c3-4b9d-890d-7b605a91f0d8") : secret "frr-k8s-webhook-server-cert" not found Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426065 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3257ff95-9460-49e8-8ae9-9758e419abee-metallb-excludel2\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426148 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afd495b6-4c71-455d-ba10-061bbf630cc5-metrics-certs\") pod \"controller-86ddb6bd46-sgqvl\" (UID: \"afd495b6-4c71-455d-ba10-061bbf630cc5\") " pod="metallb-system/controller-86ddb6bd46-sgqvl" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426196 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56dhq\" (UniqueName: \"kubernetes.io/projected/24de1316-bb5e-4366-a897-7d4dc6349b1f-kube-api-access-56dhq\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426236 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-frr-sockets\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426229 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/24de1316-bb5e-4366-a897-7d4dc6349b1f-frr-startup\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426293 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-metrics\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426314 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-reloader\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426333 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-frr-conf\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426350 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vth8v\" (UniqueName: \"kubernetes.io/projected/d65903fa-28c3-4b9d-890d-7b605a91f0d8-kube-api-access-vth8v\") pod \"frr-k8s-webhook-server-7f989f654f-xwnrs\" (UID: \"d65903fa-28c3-4b9d-890d-7b605a91f0d8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426565 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-metrics\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " 
pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426645 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-reloader\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.426806 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/24de1316-bb5e-4366-a897-7d4dc6349b1f-frr-conf\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.427433 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/24de1316-bb5e-4366-a897-7d4dc6349b1f-frr-startup\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.446286 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vth8v\" (UniqueName: \"kubernetes.io/projected/d65903fa-28c3-4b9d-890d-7b605a91f0d8-kube-api-access-vth8v\") pod \"frr-k8s-webhook-server-7f989f654f-xwnrs\" (UID: \"d65903fa-28c3-4b9d-890d-7b605a91f0d8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.447771 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56dhq\" (UniqueName: \"kubernetes.io/projected/24de1316-bb5e-4366-a897-7d4dc6349b1f-kube-api-access-56dhq\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.536118 4819 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-fqt59\" (UniqueName: \"kubernetes.io/projected/afd495b6-4c71-455d-ba10-061bbf630cc5-kube-api-access-fqt59\") pod \"controller-86ddb6bd46-sgqvl\" (UID: \"afd495b6-4c71-455d-ba10-061bbf630cc5\") " pod="metallb-system/controller-86ddb6bd46-sgqvl" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.536230 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afd495b6-4c71-455d-ba10-061bbf630cc5-cert\") pod \"controller-86ddb6bd46-sgqvl\" (UID: \"afd495b6-4c71-455d-ba10-061bbf630cc5\") " pod="metallb-system/controller-86ddb6bd46-sgqvl" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.536320 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-metrics-certs\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.536354 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afd495b6-4c71-455d-ba10-061bbf630cc5-metrics-certs\") pod \"controller-86ddb6bd46-sgqvl\" (UID: \"afd495b6-4c71-455d-ba10-061bbf630cc5\") " pod="metallb-system/controller-86ddb6bd46-sgqvl" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.536385 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3257ff95-9460-49e8-8ae9-9758e419abee-metallb-excludel2\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.536555 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-memberlist\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:50 crc kubenswrapper[4819]: E0228 03:48:50.536580 4819 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.536647 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tdws\" (UniqueName: \"kubernetes.io/projected/3257ff95-9460-49e8-8ae9-9758e419abee-kube-api-access-5tdws\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:50 crc kubenswrapper[4819]: E0228 03:48:50.536683 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-metrics-certs podName:3257ff95-9460-49e8-8ae9-9758e419abee nodeName:}" failed. No retries permitted until 2026-02-28 03:48:51.036656114 +0000 UTC m=+869.502224982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-metrics-certs") pod "speaker-zt2hg" (UID: "3257ff95-9460-49e8-8ae9-9758e419abee") : secret "speaker-certs-secret" not found Feb 28 03:48:50 crc kubenswrapper[4819]: E0228 03:48:50.537007 4819 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 28 03:48:50 crc kubenswrapper[4819]: E0228 03:48:50.537081 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-memberlist podName:3257ff95-9460-49e8-8ae9-9758e419abee nodeName:}" failed. No retries permitted until 2026-02-28 03:48:51.037060424 +0000 UTC m=+869.502629282 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-memberlist") pod "speaker-zt2hg" (UID: "3257ff95-9460-49e8-8ae9-9758e419abee") : secret "metallb-memberlist" not found Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.537683 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3257ff95-9460-49e8-8ae9-9758e419abee-metallb-excludel2\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.540755 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afd495b6-4c71-455d-ba10-061bbf630cc5-metrics-certs\") pod \"controller-86ddb6bd46-sgqvl\" (UID: \"afd495b6-4c71-455d-ba10-061bbf630cc5\") " pod="metallb-system/controller-86ddb6bd46-sgqvl" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.551553 4819 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.560090 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afd495b6-4c71-455d-ba10-061bbf630cc5-cert\") pod \"controller-86ddb6bd46-sgqvl\" (UID: \"afd495b6-4c71-455d-ba10-061bbf630cc5\") " pod="metallb-system/controller-86ddb6bd46-sgqvl" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.562720 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tdws\" (UniqueName: \"kubernetes.io/projected/3257ff95-9460-49e8-8ae9-9758e419abee-kube-api-access-5tdws\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.565584 4819 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fqt59\" (UniqueName: \"kubernetes.io/projected/afd495b6-4c71-455d-ba10-061bbf630cc5-kube-api-access-fqt59\") pod \"controller-86ddb6bd46-sgqvl\" (UID: \"afd495b6-4c71-455d-ba10-061bbf630cc5\") " pod="metallb-system/controller-86ddb6bd46-sgqvl" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.702885 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-sgqvl" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.944961 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24de1316-bb5e-4366-a897-7d4dc6349b1f-metrics-certs\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.945400 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d65903fa-28c3-4b9d-890d-7b605a91f0d8-cert\") pod \"frr-k8s-webhook-server-7f989f654f-xwnrs\" (UID: \"d65903fa-28c3-4b9d-890d-7b605a91f0d8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.951393 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d65903fa-28c3-4b9d-890d-7b605a91f0d8-cert\") pod \"frr-k8s-webhook-server-7f989f654f-xwnrs\" (UID: \"d65903fa-28c3-4b9d-890d-7b605a91f0d8\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs" Feb 28 03:48:50 crc kubenswrapper[4819]: I0228 03:48:50.952711 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24de1316-bb5e-4366-a897-7d4dc6349b1f-metrics-certs\") pod \"frr-k8s-n8zsj\" (UID: \"24de1316-bb5e-4366-a897-7d4dc6349b1f\") " pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:51 crc 
kubenswrapper[4819]: I0228 03:48:51.046468 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-metrics-certs\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:51 crc kubenswrapper[4819]: I0228 03:48:51.046529 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-memberlist\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:51 crc kubenswrapper[4819]: E0228 03:48:51.046669 4819 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 28 03:48:51 crc kubenswrapper[4819]: E0228 03:48:51.046713 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-memberlist podName:3257ff95-9460-49e8-8ae9-9758e419abee nodeName:}" failed. No retries permitted until 2026-02-28 03:48:52.046699741 +0000 UTC m=+870.512268599 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-memberlist") pod "speaker-zt2hg" (UID: "3257ff95-9460-49e8-8ae9-9758e419abee") : secret "metallb-memberlist" not found Feb 28 03:48:51 crc kubenswrapper[4819]: I0228 03:48:51.051901 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-metrics-certs\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:51 crc kubenswrapper[4819]: I0228 03:48:51.195760 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-n8zsj" Feb 28 03:48:51 crc kubenswrapper[4819]: I0228 03:48:51.207017 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs" Feb 28 03:48:51 crc kubenswrapper[4819]: I0228 03:48:51.236432 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-sgqvl"] Feb 28 03:48:51 crc kubenswrapper[4819]: I0228 03:48:51.629810 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs"] Feb 28 03:48:51 crc kubenswrapper[4819]: I0228 03:48:51.861840 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n8zsj" event={"ID":"24de1316-bb5e-4366-a897-7d4dc6349b1f","Type":"ContainerStarted","Data":"f442519c1cf48c3215ac09ea7e1af42e57ee1e1c1feff46ba95f901d5ab15f60"} Feb 28 03:48:51 crc kubenswrapper[4819]: I0228 03:48:51.864358 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs" event={"ID":"d65903fa-28c3-4b9d-890d-7b605a91f0d8","Type":"ContainerStarted","Data":"08bbcc5a952b28ab84bb790b95b2d6ad58a0505016bd48a030caf29832d0a34f"} Feb 28 03:48:51 crc kubenswrapper[4819]: I0228 03:48:51.866914 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-sgqvl" event={"ID":"afd495b6-4c71-455d-ba10-061bbf630cc5","Type":"ContainerStarted","Data":"ea8bc61e84af8b77032990f9092ad88aa05f5a5ea63b9a07a691805eb9f2c988"} Feb 28 03:48:51 crc kubenswrapper[4819]: I0228 03:48:51.866996 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-sgqvl" event={"ID":"afd495b6-4c71-455d-ba10-061bbf630cc5","Type":"ContainerStarted","Data":"c67d969e0f8e54f5a9e91caaa56f20c42d8db6751c05c530cec804c14158810d"} Feb 28 03:48:52 crc kubenswrapper[4819]: I0228 03:48:52.068793 4819 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-memberlist\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:52 crc kubenswrapper[4819]: I0228 03:48:52.074531 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3257ff95-9460-49e8-8ae9-9758e419abee-memberlist\") pod \"speaker-zt2hg\" (UID: \"3257ff95-9460-49e8-8ae9-9758e419abee\") " pod="metallb-system/speaker-zt2hg" Feb 28 03:48:52 crc kubenswrapper[4819]: I0228 03:48:52.186236 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zt2hg" Feb 28 03:48:52 crc kubenswrapper[4819]: W0228 03:48:52.233710 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3257ff95_9460_49e8_8ae9_9758e419abee.slice/crio-8d8ffd31a1652e9f904405297c22ac7f7cbbf37b218fceda4bc5dd271604ebdb WatchSource:0}: Error finding container 8d8ffd31a1652e9f904405297c22ac7f7cbbf37b218fceda4bc5dd271604ebdb: Status 404 returned error can't find the container with id 8d8ffd31a1652e9f904405297c22ac7f7cbbf37b218fceda4bc5dd271604ebdb Feb 28 03:48:52 crc kubenswrapper[4819]: I0228 03:48:52.878229 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zt2hg" event={"ID":"3257ff95-9460-49e8-8ae9-9758e419abee","Type":"ContainerStarted","Data":"8742f54fa909859a67e522f3ee75d1b7236c7cfd6b62397b401d3b5aa4e6e5bc"} Feb 28 03:48:52 crc kubenswrapper[4819]: I0228 03:48:52.878365 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zt2hg" event={"ID":"3257ff95-9460-49e8-8ae9-9758e419abee","Type":"ContainerStarted","Data":"8d8ffd31a1652e9f904405297c22ac7f7cbbf37b218fceda4bc5dd271604ebdb"} Feb 28 03:48:55 crc kubenswrapper[4819]: I0228 03:48:55.904543 4819 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zt2hg" event={"ID":"3257ff95-9460-49e8-8ae9-9758e419abee","Type":"ContainerStarted","Data":"062463080f86225ead6ad0897fce61a05a19e56eda452ea3c7575d0c99cad790"}
Feb 28 03:48:55 crc kubenswrapper[4819]: I0228 03:48:55.904911 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zt2hg"
Feb 28 03:48:55 crc kubenswrapper[4819]: I0228 03:48:55.907494 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-sgqvl" event={"ID":"afd495b6-4c71-455d-ba10-061bbf630cc5","Type":"ContainerStarted","Data":"5598b5b3d3a5b051b684f8c7c37c8d8c4613ff50404ffa2ffb3b2402ebed8b53"}
Feb 28 03:48:55 crc kubenswrapper[4819]: I0228 03:48:55.907634 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-sgqvl"
Feb 28 03:48:55 crc kubenswrapper[4819]: I0228 03:48:55.919689 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zt2hg" podStartSLOduration=3.46429467 podStartE2EDuration="5.91967428s" podCreationTimestamp="2026-02-28 03:48:50 +0000 UTC" firstStartedPulling="2026-02-28 03:48:52.584462911 +0000 UTC m=+871.050031769" lastFinishedPulling="2026-02-28 03:48:55.039842521 +0000 UTC m=+873.505411379" observedRunningTime="2026-02-28 03:48:55.917076846 +0000 UTC m=+874.382645704" watchObservedRunningTime="2026-02-28 03:48:55.91967428 +0000 UTC m=+874.385243138"
Feb 28 03:48:55 crc kubenswrapper[4819]: I0228 03:48:55.937505 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-sgqvl" podStartSLOduration=2.282628154 podStartE2EDuration="5.937483178s" podCreationTimestamp="2026-02-28 03:48:50 +0000 UTC" firstStartedPulling="2026-02-28 03:48:51.377472331 +0000 UTC m=+869.843041209" lastFinishedPulling="2026-02-28 03:48:55.032327375 +0000 UTC m=+873.497896233" observedRunningTime="2026-02-28 03:48:55.934184317 +0000 UTC m=+874.399753175" watchObservedRunningTime="2026-02-28 03:48:55.937483178 +0000 UTC m=+874.403052046"
Feb 28 03:48:59 crc kubenswrapper[4819]: I0228 03:48:59.978978 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs" event={"ID":"d65903fa-28c3-4b9d-890d-7b605a91f0d8","Type":"ContainerStarted","Data":"9c84a5a6d69968987f41481f66e7ef7be3b3a9573f6de07b859d4652364b43ca"}
Feb 28 03:48:59 crc kubenswrapper[4819]: I0228 03:48:59.979859 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs"
Feb 28 03:48:59 crc kubenswrapper[4819]: I0228 03:48:59.983135 4819 generic.go:334] "Generic (PLEG): container finished" podID="24de1316-bb5e-4366-a897-7d4dc6349b1f" containerID="bacf1853249cd4ce256569f4702fcac114d032a2779ff0d21a5a253db3cd7d21" exitCode=0
Feb 28 03:48:59 crc kubenswrapper[4819]: I0228 03:48:59.983171 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n8zsj" event={"ID":"24de1316-bb5e-4366-a897-7d4dc6349b1f","Type":"ContainerDied","Data":"bacf1853249cd4ce256569f4702fcac114d032a2779ff0d21a5a253db3cd7d21"}
Feb 28 03:49:00 crc kubenswrapper[4819]: I0228 03:49:00.009866 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs" podStartSLOduration=2.675223997 podStartE2EDuration="10.00984521s" podCreationTimestamp="2026-02-28 03:48:50 +0000 UTC" firstStartedPulling="2026-02-28 03:48:51.64653088 +0000 UTC m=+870.112099748" lastFinishedPulling="2026-02-28 03:48:58.981152093 +0000 UTC m=+877.446720961" observedRunningTime="2026-02-28 03:49:00.005148744 +0000 UTC m=+878.470717632" watchObservedRunningTime="2026-02-28 03:49:00.00984521 +0000 UTC m=+878.475414068"
Feb 28 03:49:00 crc kubenswrapper[4819]: I0228 03:49:00.834407 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 03:49:00 crc kubenswrapper[4819]: I0228 03:49:00.834693 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 03:49:00 crc kubenswrapper[4819]: I0228 03:49:00.991190 4819 generic.go:334] "Generic (PLEG): container finished" podID="24de1316-bb5e-4366-a897-7d4dc6349b1f" containerID="fd22caddee79fa18aecda242d717254bb4a2150516d7aaa8d00bc4a49c240933" exitCode=0
Feb 28 03:49:00 crc kubenswrapper[4819]: I0228 03:49:00.991225 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n8zsj" event={"ID":"24de1316-bb5e-4366-a897-7d4dc6349b1f","Type":"ContainerDied","Data":"fd22caddee79fa18aecda242d717254bb4a2150516d7aaa8d00bc4a49c240933"}
Feb 28 03:49:02 crc kubenswrapper[4819]: I0228 03:49:02.004202 4819 generic.go:334] "Generic (PLEG): container finished" podID="24de1316-bb5e-4366-a897-7d4dc6349b1f" containerID="516f940209e7b943965f803b237577b10f177810fc47e552ca079a2ffe3eefd1" exitCode=0
Feb 28 03:49:02 crc kubenswrapper[4819]: I0228 03:49:02.004435 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n8zsj" event={"ID":"24de1316-bb5e-4366-a897-7d4dc6349b1f","Type":"ContainerDied","Data":"516f940209e7b943965f803b237577b10f177810fc47e552ca079a2ffe3eefd1"}
Feb 28 03:49:02 crc kubenswrapper[4819]: I0228 03:49:02.191038 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zt2hg"
Feb 28 03:49:03 crc kubenswrapper[4819]: I0228 03:49:03.019537 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n8zsj" event={"ID":"24de1316-bb5e-4366-a897-7d4dc6349b1f","Type":"ContainerStarted","Data":"b621a1a3777dda035114f3d56a32b671956de49835fed86ae9661b7fea490f4e"}
Feb 28 03:49:03 crc kubenswrapper[4819]: I0228 03:49:03.019930 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n8zsj" event={"ID":"24de1316-bb5e-4366-a897-7d4dc6349b1f","Type":"ContainerStarted","Data":"7d4aa74ae3e030f44cee5db92bcb93c869277e135e485c5183a699f1cfcfc17d"}
Feb 28 03:49:03 crc kubenswrapper[4819]: I0228 03:49:03.019947 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n8zsj" event={"ID":"24de1316-bb5e-4366-a897-7d4dc6349b1f","Type":"ContainerStarted","Data":"43232ac94dcaa1d8e107e704e865e82d1ca46da5c3e79bdaa411c7fc9c7584db"}
Feb 28 03:49:03 crc kubenswrapper[4819]: I0228 03:49:03.019961 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n8zsj" event={"ID":"24de1316-bb5e-4366-a897-7d4dc6349b1f","Type":"ContainerStarted","Data":"9b4d32f887533730a45beab1fd4ef30672265421b973b1c635c06055de2dc0b3"}
Feb 28 03:49:03 crc kubenswrapper[4819]: I0228 03:49:03.019974 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n8zsj" event={"ID":"24de1316-bb5e-4366-a897-7d4dc6349b1f","Type":"ContainerStarted","Data":"7b5043b446b7e80d598d6cc68e88db7e8e536145ea291ac2cae52a8b895aa293"}
Feb 28 03:49:04 crc kubenswrapper[4819]: I0228 03:49:04.032929 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n8zsj" event={"ID":"24de1316-bb5e-4366-a897-7d4dc6349b1f","Type":"ContainerStarted","Data":"b4574965fd15704af5ab5f297f2748bfb8c9adfd2f622e556682434ed74166ee"}
Feb 28 03:49:04 crc kubenswrapper[4819]: I0228 03:49:04.033436 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:49:04 crc kubenswrapper[4819]: I0228 03:49:04.089203 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-n8zsj" podStartSLOduration=6.516358932 podStartE2EDuration="14.089172564s" podCreationTimestamp="2026-02-28 03:48:50 +0000 UTC" firstStartedPulling="2026-02-28 03:48:51.36487821 +0000 UTC m=+869.830447068" lastFinishedPulling="2026-02-28 03:48:58.937691802 +0000 UTC m=+877.403260700" observedRunningTime="2026-02-28 03:49:04.083294139 +0000 UTC m=+882.548863077" watchObservedRunningTime="2026-02-28 03:49:04.089172564 +0000 UTC m=+882.554741452"
Feb 28 03:49:06 crc kubenswrapper[4819]: I0228 03:49:06.196538 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:49:06 crc kubenswrapper[4819]: I0228 03:49:06.260389 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:49:07 crc kubenswrapper[4819]: I0228 03:49:07.859681 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-vlw7b"]
Feb 28 03:49:07 crc kubenswrapper[4819]: I0228 03:49:07.860530 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-vlw7b"
Feb 28 03:49:07 crc kubenswrapper[4819]: I0228 03:49:07.864306 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-fsrrj"
Feb 28 03:49:07 crc kubenswrapper[4819]: I0228 03:49:07.864316 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Feb 28 03:49:07 crc kubenswrapper[4819]: I0228 03:49:07.864351 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Feb 28 03:49:07 crc kubenswrapper[4819]: I0228 03:49:07.875558 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-vlw7b"]
Feb 28 03:49:07 crc kubenswrapper[4819]: I0228 03:49:07.987936 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzvnl\" (UniqueName: \"kubernetes.io/projected/790eff89-c9e4-47ac-8b81-c7276fbdb085-kube-api-access-vzvnl\") pod \"mariadb-operator-index-vlw7b\" (UID: \"790eff89-c9e4-47ac-8b81-c7276fbdb085\") " pod="openstack-operators/mariadb-operator-index-vlw7b"
Feb 28 03:49:08 crc kubenswrapper[4819]: I0228 03:49:08.089392 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzvnl\" (UniqueName: \"kubernetes.io/projected/790eff89-c9e4-47ac-8b81-c7276fbdb085-kube-api-access-vzvnl\") pod \"mariadb-operator-index-vlw7b\" (UID: \"790eff89-c9e4-47ac-8b81-c7276fbdb085\") " pod="openstack-operators/mariadb-operator-index-vlw7b"
Feb 28 03:49:08 crc kubenswrapper[4819]: I0228 03:49:08.111283 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzvnl\" (UniqueName: \"kubernetes.io/projected/790eff89-c9e4-47ac-8b81-c7276fbdb085-kube-api-access-vzvnl\") pod \"mariadb-operator-index-vlw7b\" (UID: \"790eff89-c9e4-47ac-8b81-c7276fbdb085\") " pod="openstack-operators/mariadb-operator-index-vlw7b"
Feb 28 03:49:08 crc kubenswrapper[4819]: I0228 03:49:08.187465 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-vlw7b"
Feb 28 03:49:08 crc kubenswrapper[4819]: I0228 03:49:08.482107 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-vlw7b"]
Feb 28 03:49:08 crc kubenswrapper[4819]: W0228 03:49:08.496560 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790eff89_c9e4_47ac_8b81_c7276fbdb085.slice/crio-74506db97c1c15ccc13dce31c74b3e30074932e3f9a270260fb6b42880b3e80b WatchSource:0}: Error finding container 74506db97c1c15ccc13dce31c74b3e30074932e3f9a270260fb6b42880b3e80b: Status 404 returned error can't find the container with id 74506db97c1c15ccc13dce31c74b3e30074932e3f9a270260fb6b42880b3e80b
Feb 28 03:49:08 crc kubenswrapper[4819]: I0228 03:49:08.502760 4819 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 28 03:49:09 crc kubenswrapper[4819]: I0228 03:49:09.081692 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-vlw7b" event={"ID":"790eff89-c9e4-47ac-8b81-c7276fbdb085","Type":"ContainerStarted","Data":"74506db97c1c15ccc13dce31c74b3e30074932e3f9a270260fb6b42880b3e80b"}
Feb 28 03:49:10 crc kubenswrapper[4819]: I0228 03:49:10.090380 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-vlw7b" event={"ID":"790eff89-c9e4-47ac-8b81-c7276fbdb085","Type":"ContainerStarted","Data":"6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5"}
Feb 28 03:49:10 crc kubenswrapper[4819]: I0228 03:49:10.119976 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-vlw7b" podStartSLOduration=2.414258562 podStartE2EDuration="3.119952049s" podCreationTimestamp="2026-02-28 03:49:07 +0000 UTC" firstStartedPulling="2026-02-28 03:49:08.502505447 +0000 UTC m=+886.968074305" lastFinishedPulling="2026-02-28 03:49:09.208198914 +0000 UTC m=+887.673767792" observedRunningTime="2026-02-28 03:49:10.115022808 +0000 UTC m=+888.580591726" watchObservedRunningTime="2026-02-28 03:49:10.119952049 +0000 UTC m=+888.585520957"
Feb 28 03:49:10 crc kubenswrapper[4819]: I0228 03:49:10.707572 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-sgqvl"
Feb 28 03:49:11 crc kubenswrapper[4819]: I0228 03:49:11.215390 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-xwnrs"
Feb 28 03:49:18 crc kubenswrapper[4819]: I0228 03:49:18.188485 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-vlw7b"
Feb 28 03:49:18 crc kubenswrapper[4819]: I0228 03:49:18.189137 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-vlw7b"
Feb 28 03:49:18 crc kubenswrapper[4819]: I0228 03:49:18.240424 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-vlw7b"
Feb 28 03:49:19 crc kubenswrapper[4819]: I0228 03:49:19.213044 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-vlw7b"
Feb 28 03:49:21 crc kubenswrapper[4819]: I0228 03:49:21.201057 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-n8zsj"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.544684 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"]
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.546454 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.549892 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gzck7"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.555735 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"]
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.624594 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlqg7\" (UniqueName: \"kubernetes.io/projected/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-kube-api-access-xlqg7\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.624668 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-bundle\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.624717 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-util\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.726224 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-util\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.726354 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqg7\" (UniqueName: \"kubernetes.io/projected/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-kube-api-access-xlqg7\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.726396 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-bundle\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.726940 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-bundle\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.727023 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-util\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.747926 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqg7\" (UniqueName: \"kubernetes.io/projected/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-kube-api-access-xlqg7\") pod \"210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") " pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:24 crc kubenswrapper[4819]: I0228 03:49:24.862459 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:25 crc kubenswrapper[4819]: I0228 03:49:25.134209 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"]
Feb 28 03:49:25 crc kubenswrapper[4819]: I0228 03:49:25.210479 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f" event={"ID":"f0ae2373-5f2b-4f8e-8fc7-5d440483017d","Type":"ContainerStarted","Data":"9af4cfd39bbe386aacebbf06a3d770dc1fa697e1430e3dede9154ee2d4662921"}
Feb 28 03:49:26 crc kubenswrapper[4819]: I0228 03:49:26.219133 4819 generic.go:334] "Generic (PLEG): container finished" podID="f0ae2373-5f2b-4f8e-8fc7-5d440483017d" containerID="0ef72527af940da10e8c69b1cc4add2231f6627f2cc030c5dbcafb50c003d46b" exitCode=0
Feb 28 03:49:26 crc kubenswrapper[4819]: I0228 03:49:26.219173 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f" event={"ID":"f0ae2373-5f2b-4f8e-8fc7-5d440483017d","Type":"ContainerDied","Data":"0ef72527af940da10e8c69b1cc4add2231f6627f2cc030c5dbcafb50c003d46b"}
Feb 28 03:49:27 crc kubenswrapper[4819]: I0228 03:49:27.228781 4819 generic.go:334] "Generic (PLEG): container finished" podID="f0ae2373-5f2b-4f8e-8fc7-5d440483017d" containerID="77b34b3263acbaee8b2d070e595c53bed9567865efe5f9db59c0974776600bc9" exitCode=0
Feb 28 03:49:27 crc kubenswrapper[4819]: I0228 03:49:27.228843 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f" event={"ID":"f0ae2373-5f2b-4f8e-8fc7-5d440483017d","Type":"ContainerDied","Data":"77b34b3263acbaee8b2d070e595c53bed9567865efe5f9db59c0974776600bc9"}
Feb 28 03:49:28 crc kubenswrapper[4819]: I0228 03:49:28.239543 4819 generic.go:334] "Generic (PLEG): container finished" podID="f0ae2373-5f2b-4f8e-8fc7-5d440483017d" containerID="4898dfae76fcdf0b68cdc60b454d69c64c8090aea908fb497271284d4811cd23" exitCode=0
Feb 28 03:49:28 crc kubenswrapper[4819]: I0228 03:49:28.239609 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f" event={"ID":"f0ae2373-5f2b-4f8e-8fc7-5d440483017d","Type":"ContainerDied","Data":"4898dfae76fcdf0b68cdc60b454d69c64c8090aea908fb497271284d4811cd23"}
Feb 28 03:49:29 crc kubenswrapper[4819]: I0228 03:49:29.546681 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:29 crc kubenswrapper[4819]: I0228 03:49:29.611929 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlqg7\" (UniqueName: \"kubernetes.io/projected/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-kube-api-access-xlqg7\") pod \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") "
Feb 28 03:49:29 crc kubenswrapper[4819]: I0228 03:49:29.612028 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-util\") pod \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") "
Feb 28 03:49:29 crc kubenswrapper[4819]: I0228 03:49:29.612090 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-bundle\") pod \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\" (UID: \"f0ae2373-5f2b-4f8e-8fc7-5d440483017d\") "
Feb 28 03:49:29 crc kubenswrapper[4819]: I0228 03:49:29.613654 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-bundle" (OuterVolumeSpecName: "bundle") pod "f0ae2373-5f2b-4f8e-8fc7-5d440483017d" (UID: "f0ae2373-5f2b-4f8e-8fc7-5d440483017d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:49:29 crc kubenswrapper[4819]: I0228 03:49:29.617463 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-kube-api-access-xlqg7" (OuterVolumeSpecName: "kube-api-access-xlqg7") pod "f0ae2373-5f2b-4f8e-8fc7-5d440483017d" (UID: "f0ae2373-5f2b-4f8e-8fc7-5d440483017d"). InnerVolumeSpecName "kube-api-access-xlqg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:49:29 crc kubenswrapper[4819]: I0228 03:49:29.640635 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-util" (OuterVolumeSpecName: "util") pod "f0ae2373-5f2b-4f8e-8fc7-5d440483017d" (UID: "f0ae2373-5f2b-4f8e-8fc7-5d440483017d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:49:29 crc kubenswrapper[4819]: I0228 03:49:29.714180 4819 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-util\") on node \"crc\" DevicePath \"\""
Feb 28 03:49:29 crc kubenswrapper[4819]: I0228 03:49:29.714233 4819 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 03:49:29 crc kubenswrapper[4819]: I0228 03:49:29.714276 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlqg7\" (UniqueName: \"kubernetes.io/projected/f0ae2373-5f2b-4f8e-8fc7-5d440483017d-kube-api-access-xlqg7\") on node \"crc\" DevicePath \"\""
Feb 28 03:49:30 crc kubenswrapper[4819]: I0228 03:49:30.254296 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f" event={"ID":"f0ae2373-5f2b-4f8e-8fc7-5d440483017d","Type":"ContainerDied","Data":"9af4cfd39bbe386aacebbf06a3d770dc1fa697e1430e3dede9154ee2d4662921"}
Feb 28 03:49:30 crc kubenswrapper[4819]: I0228 03:49:30.254337 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9af4cfd39bbe386aacebbf06a3d770dc1fa697e1430e3dede9154ee2d4662921"
Feb 28 03:49:30 crc kubenswrapper[4819]: I0228 03:49:30.254361 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"
Feb 28 03:49:30 crc kubenswrapper[4819]: I0228 03:49:30.834169 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 03:49:30 crc kubenswrapper[4819]: I0228 03:49:30.834781 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.362736 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"]
Feb 28 03:49:38 crc kubenswrapper[4819]: E0228 03:49:38.364415 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ae2373-5f2b-4f8e-8fc7-5d440483017d" containerName="pull"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.364563 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ae2373-5f2b-4f8e-8fc7-5d440483017d" containerName="pull"
Feb 28 03:49:38 crc kubenswrapper[4819]: E0228 03:49:38.365337 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ae2373-5f2b-4f8e-8fc7-5d440483017d" containerName="util"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.365350 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ae2373-5f2b-4f8e-8fc7-5d440483017d" containerName="util"
Feb 28 03:49:38 crc kubenswrapper[4819]: E0228 03:49:38.365386 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ae2373-5f2b-4f8e-8fc7-5d440483017d" containerName="extract"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.365394 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ae2373-5f2b-4f8e-8fc7-5d440483017d" containerName="extract"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.365585 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ae2373-5f2b-4f8e-8fc7-5d440483017d" containerName="extract"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.366141 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.368984 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.369871 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-t6zl7"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.370000 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.382401 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"]
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.443959 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-webhook-cert\") pod \"mariadb-operator-controller-manager-69797db6c9-792tf\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") " pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.444639 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-apiservice-cert\") pod \"mariadb-operator-controller-manager-69797db6c9-792tf\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") " pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.444702 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvchz\" (UniqueName: \"kubernetes.io/projected/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-kube-api-access-lvchz\") pod \"mariadb-operator-controller-manager-69797db6c9-792tf\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") " pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.546602 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-webhook-cert\") pod \"mariadb-operator-controller-manager-69797db6c9-792tf\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") " pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.548017 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-apiservice-cert\") pod \"mariadb-operator-controller-manager-69797db6c9-792tf\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") " pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.548057 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvchz\" (UniqueName: \"kubernetes.io/projected/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-kube-api-access-lvchz\") pod \"mariadb-operator-controller-manager-69797db6c9-792tf\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") " pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.556734 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-apiservice-cert\") pod \"mariadb-operator-controller-manager-69797db6c9-792tf\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") " pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.571844 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-webhook-cert\") pod \"mariadb-operator-controller-manager-69797db6c9-792tf\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") " pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.575138 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvchz\" (UniqueName: \"kubernetes.io/projected/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-kube-api-access-lvchz\") pod \"mariadb-operator-controller-manager-69797db6c9-792tf\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") " pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.694943 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:38 crc kubenswrapper[4819]: I0228 03:49:38.902474 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"]
Feb 28 03:49:38 crc kubenswrapper[4819]: W0228 03:49:38.910858 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c1220fd_a97c_40ec_8c9b_85fa123a3c61.slice/crio-9cba7400e2e300df308c0b87f775c1e9583eb30c42803ebfaca5b2f4b5c0ce7f WatchSource:0}: Error finding container 9cba7400e2e300df308c0b87f775c1e9583eb30c42803ebfaca5b2f4b5c0ce7f: Status 404 returned error can't find the container with id 9cba7400e2e300df308c0b87f775c1e9583eb30c42803ebfaca5b2f4b5c0ce7f
Feb 28 03:49:39 crc kubenswrapper[4819]: I0228 03:49:39.321326 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf" event={"ID":"1c1220fd-a97c-40ec-8c9b-85fa123a3c61","Type":"ContainerStarted","Data":"9cba7400e2e300df308c0b87f775c1e9583eb30c42803ebfaca5b2f4b5c0ce7f"}
Feb 28 03:49:43 crc kubenswrapper[4819]: I0228 03:49:43.352853 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf" event={"ID":"1c1220fd-a97c-40ec-8c9b-85fa123a3c61","Type":"ContainerStarted","Data":"0700b20e8409cfdecd94e9ae35aa169a5b97fb34d3f8bf4446257dbcbad3faa9"}
Feb 28 03:49:43 crc kubenswrapper[4819]: I0228 03:49:43.353469 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:43 crc kubenswrapper[4819]: I0228 03:49:43.385822 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf" podStartSLOduration=1.366264178 podStartE2EDuration="5.385774989s" podCreationTimestamp="2026-02-28 03:49:38 +0000 UTC" firstStartedPulling="2026-02-28 03:49:38.914525419 +0000 UTC m=+917.380094277" lastFinishedPulling="2026-02-28 03:49:42.93403623 +0000 UTC m=+921.399605088" observedRunningTime="2026-02-28 03:49:43.376909079 +0000 UTC m=+921.842478007" watchObservedRunningTime="2026-02-28 03:49:43.385774989 +0000 UTC m=+921.851343887"
Feb 28 03:49:48 crc kubenswrapper[4819]: I0228 03:49:48.701955 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:49:52 crc kubenswrapper[4819]: I0228 03:49:52.476928 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-bvs5z"]
Feb 28 03:49:52 crc kubenswrapper[4819]: I0228 03:49:52.478379 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-bvs5z"
Feb 28 03:49:52 crc kubenswrapper[4819]: I0228 03:49:52.480697 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-mb548"
Feb 28 03:49:52 crc kubenswrapper[4819]: I0228 03:49:52.557686 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw4vd\" (UniqueName: \"kubernetes.io/projected/a99c2560-7558-4825-b0f3-e00e94065eb3-kube-api-access-dw4vd\") pod \"infra-operator-index-bvs5z\" (UID: \"a99c2560-7558-4825-b0f3-e00e94065eb3\") " pod="openstack-operators/infra-operator-index-bvs5z"
Feb 28 03:49:52 crc kubenswrapper[4819]: I0228 03:49:52.568330 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-bvs5z"]
Feb 28 03:49:52 crc kubenswrapper[4819]: I0228 03:49:52.659133 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw4vd\" (UniqueName: \"kubernetes.io/projected/a99c2560-7558-4825-b0f3-e00e94065eb3-kube-api-access-dw4vd\") pod \"infra-operator-index-bvs5z\" (UID: \"a99c2560-7558-4825-b0f3-e00e94065eb3\") " pod="openstack-operators/infra-operator-index-bvs5z"
Feb 28 03:49:52 crc kubenswrapper[4819]: I0228 03:49:52.685083 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw4vd\" (UniqueName: \"kubernetes.io/projected/a99c2560-7558-4825-b0f3-e00e94065eb3-kube-api-access-dw4vd\") pod \"infra-operator-index-bvs5z\" (UID: \"a99c2560-7558-4825-b0f3-e00e94065eb3\") " pod="openstack-operators/infra-operator-index-bvs5z"
Feb 28 03:49:52 crc kubenswrapper[4819]: I0228 03:49:52.820444 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-bvs5z"
Feb 28 03:49:53 crc kubenswrapper[4819]: I0228 03:49:53.120841 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-bvs5z"]
Feb 28 03:49:53 crc kubenswrapper[4819]: W0228 03:49:53.124540 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda99c2560_7558_4825_b0f3_e00e94065eb3.slice/crio-d4407a28ccd03aa4447cb0401b61720bb2887bb2f44a88902266dc1d10c60b89 WatchSource:0}: Error finding container d4407a28ccd03aa4447cb0401b61720bb2887bb2f44a88902266dc1d10c60b89: Status 404 returned error can't find the container with id d4407a28ccd03aa4447cb0401b61720bb2887bb2f44a88902266dc1d10c60b89
Feb 28 03:49:53 crc kubenswrapper[4819]: I0228 03:49:53.432094 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-bvs5z" event={"ID":"a99c2560-7558-4825-b0f3-e00e94065eb3","Type":"ContainerStarted","Data":"d4407a28ccd03aa4447cb0401b61720bb2887bb2f44a88902266dc1d10c60b89"}
Feb 28 03:49:54 crc kubenswrapper[4819]: I0228 03:49:54.442478 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack-operators/infra-operator-index-bvs5z" event={"ID":"a99c2560-7558-4825-b0f3-e00e94065eb3","Type":"ContainerStarted","Data":"f24f007b33bb820819c97bc804a401e45941d830f53f6dde0ac5748d0a803248"} Feb 28 03:49:54 crc kubenswrapper[4819]: I0228 03:49:54.470652 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-bvs5z" podStartSLOduration=1.782229236 podStartE2EDuration="2.470616932s" podCreationTimestamp="2026-02-28 03:49:52 +0000 UTC" firstStartedPulling="2026-02-28 03:49:53.126948253 +0000 UTC m=+931.592517101" lastFinishedPulling="2026-02-28 03:49:53.815335939 +0000 UTC m=+932.280904797" observedRunningTime="2026-02-28 03:49:54.462010369 +0000 UTC m=+932.927579257" watchObservedRunningTime="2026-02-28 03:49:54.470616932 +0000 UTC m=+932.936185820" Feb 28 03:49:56 crc kubenswrapper[4819]: I0228 03:49:56.675571 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-bvs5z"] Feb 28 03:49:56 crc kubenswrapper[4819]: I0228 03:49:56.675889 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-bvs5z" podUID="a99c2560-7558-4825-b0f3-e00e94065eb3" containerName="registry-server" containerID="cri-o://f24f007b33bb820819c97bc804a401e45941d830f53f6dde0ac5748d0a803248" gracePeriod=2 Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.287734 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-wpbg5"] Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.289630 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-wpbg5" Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.299716 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-wpbg5"] Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.437061 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6swdx\" (UniqueName: \"kubernetes.io/projected/1d272d93-35f0-4ece-b028-e72a1b0d7b6b-kube-api-access-6swdx\") pod \"infra-operator-index-wpbg5\" (UID: \"1d272d93-35f0-4ece-b028-e72a1b0d7b6b\") " pod="openstack-operators/infra-operator-index-wpbg5" Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.494774 4819 generic.go:334] "Generic (PLEG): container finished" podID="a99c2560-7558-4825-b0f3-e00e94065eb3" containerID="f24f007b33bb820819c97bc804a401e45941d830f53f6dde0ac5748d0a803248" exitCode=0 Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.494828 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-bvs5z" event={"ID":"a99c2560-7558-4825-b0f3-e00e94065eb3","Type":"ContainerDied","Data":"f24f007b33bb820819c97bc804a401e45941d830f53f6dde0ac5748d0a803248"} Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.538888 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6swdx\" (UniqueName: \"kubernetes.io/projected/1d272d93-35f0-4ece-b028-e72a1b0d7b6b-kube-api-access-6swdx\") pod \"infra-operator-index-wpbg5\" (UID: \"1d272d93-35f0-4ece-b028-e72a1b0d7b6b\") " pod="openstack-operators/infra-operator-index-wpbg5" Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.561547 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6swdx\" (UniqueName: \"kubernetes.io/projected/1d272d93-35f0-4ece-b028-e72a1b0d7b6b-kube-api-access-6swdx\") pod \"infra-operator-index-wpbg5\" (UID: 
\"1d272d93-35f0-4ece-b028-e72a1b0d7b6b\") " pod="openstack-operators/infra-operator-index-wpbg5" Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.631049 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-bvs5z" Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.655472 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-wpbg5" Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.741052 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw4vd\" (UniqueName: \"kubernetes.io/projected/a99c2560-7558-4825-b0f3-e00e94065eb3-kube-api-access-dw4vd\") pod \"a99c2560-7558-4825-b0f3-e00e94065eb3\" (UID: \"a99c2560-7558-4825-b0f3-e00e94065eb3\") " Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.745407 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99c2560-7558-4825-b0f3-e00e94065eb3-kube-api-access-dw4vd" (OuterVolumeSpecName: "kube-api-access-dw4vd") pod "a99c2560-7558-4825-b0f3-e00e94065eb3" (UID: "a99c2560-7558-4825-b0f3-e00e94065eb3"). InnerVolumeSpecName "kube-api-access-dw4vd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.843229 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw4vd\" (UniqueName: \"kubernetes.io/projected/a99c2560-7558-4825-b0f3-e00e94065eb3-kube-api-access-dw4vd\") on node \"crc\" DevicePath \"\"" Feb 28 03:49:57 crc kubenswrapper[4819]: I0228 03:49:57.856101 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-wpbg5"] Feb 28 03:49:57 crc kubenswrapper[4819]: W0228 03:49:57.861055 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d272d93_35f0_4ece_b028_e72a1b0d7b6b.slice/crio-4b00cd4166bd676615c661715e40b3c478ea3d7fe88459d49d88cade5d266e0c WatchSource:0}: Error finding container 4b00cd4166bd676615c661715e40b3c478ea3d7fe88459d49d88cade5d266e0c: Status 404 returned error can't find the container with id 4b00cd4166bd676615c661715e40b3c478ea3d7fe88459d49d88cade5d266e0c Feb 28 03:49:58 crc kubenswrapper[4819]: I0228 03:49:58.519003 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-wpbg5" event={"ID":"1d272d93-35f0-4ece-b028-e72a1b0d7b6b","Type":"ContainerStarted","Data":"d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca"} Feb 28 03:49:58 crc kubenswrapper[4819]: I0228 03:49:58.519489 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-wpbg5" event={"ID":"1d272d93-35f0-4ece-b028-e72a1b0d7b6b","Type":"ContainerStarted","Data":"4b00cd4166bd676615c661715e40b3c478ea3d7fe88459d49d88cade5d266e0c"} Feb 28 03:49:58 crc kubenswrapper[4819]: I0228 03:49:58.521201 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-bvs5z" 
event={"ID":"a99c2560-7558-4825-b0f3-e00e94065eb3","Type":"ContainerDied","Data":"d4407a28ccd03aa4447cb0401b61720bb2887bb2f44a88902266dc1d10c60b89"} Feb 28 03:49:58 crc kubenswrapper[4819]: I0228 03:49:58.521275 4819 scope.go:117] "RemoveContainer" containerID="f24f007b33bb820819c97bc804a401e45941d830f53f6dde0ac5748d0a803248" Feb 28 03:49:58 crc kubenswrapper[4819]: I0228 03:49:58.521437 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-bvs5z" Feb 28 03:49:58 crc kubenswrapper[4819]: I0228 03:49:58.546110 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-wpbg5" podStartSLOduration=1.1259359899999999 podStartE2EDuration="1.546091754s" podCreationTimestamp="2026-02-28 03:49:57 +0000 UTC" firstStartedPulling="2026-02-28 03:49:57.866524497 +0000 UTC m=+936.332093355" lastFinishedPulling="2026-02-28 03:49:58.286680251 +0000 UTC m=+936.752249119" observedRunningTime="2026-02-28 03:49:58.538443504 +0000 UTC m=+937.004012402" watchObservedRunningTime="2026-02-28 03:49:58.546091754 +0000 UTC m=+937.011660602" Feb 28 03:49:58 crc kubenswrapper[4819]: I0228 03:49:58.559391 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-bvs5z"] Feb 28 03:49:58 crc kubenswrapper[4819]: I0228 03:49:58.564773 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-bvs5z"] Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.125495 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537510-8mv82"] Feb 28 03:50:00 crc kubenswrapper[4819]: E0228 03:50:00.126029 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99c2560-7558-4825-b0f3-e00e94065eb3" containerName="registry-server" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.126041 4819 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a99c2560-7558-4825-b0f3-e00e94065eb3" containerName="registry-server" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.126145 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99c2560-7558-4825-b0f3-e00e94065eb3" containerName="registry-server" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.126534 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537510-8mv82" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.129052 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.129108 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.131113 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.143325 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537510-8mv82"] Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.276740 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2rwn\" (UniqueName: \"kubernetes.io/projected/dcd5b8da-92d4-4701-8aac-c77e83d76567-kube-api-access-p2rwn\") pod \"auto-csr-approver-29537510-8mv82\" (UID: \"dcd5b8da-92d4-4701-8aac-c77e83d76567\") " pod="openshift-infra/auto-csr-approver-29537510-8mv82" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.375879 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a99c2560-7558-4825-b0f3-e00e94065eb3" path="/var/lib/kubelet/pods/a99c2560-7558-4825-b0f3-e00e94065eb3/volumes" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.378087 4819 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-p2rwn\" (UniqueName: \"kubernetes.io/projected/dcd5b8da-92d4-4701-8aac-c77e83d76567-kube-api-access-p2rwn\") pod \"auto-csr-approver-29537510-8mv82\" (UID: \"dcd5b8da-92d4-4701-8aac-c77e83d76567\") " pod="openshift-infra/auto-csr-approver-29537510-8mv82" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.405374 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2rwn\" (UniqueName: \"kubernetes.io/projected/dcd5b8da-92d4-4701-8aac-c77e83d76567-kube-api-access-p2rwn\") pod \"auto-csr-approver-29537510-8mv82\" (UID: \"dcd5b8da-92d4-4701-8aac-c77e83d76567\") " pod="openshift-infra/auto-csr-approver-29537510-8mv82" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.448362 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537510-8mv82" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.833731 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.833800 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.833844 4819 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.834761 4819 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ab93bb2251f8a9fb9c9db9bc6189036f7bfbd545e0f1b6f246a96c7b8188206"} pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.834873 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" containerID="cri-o://5ab93bb2251f8a9fb9c9db9bc6189036f7bfbd545e0f1b6f246a96c7b8188206" gracePeriod=600 Feb 28 03:50:00 crc kubenswrapper[4819]: I0228 03:50:00.907789 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537510-8mv82"] Feb 28 03:50:00 crc kubenswrapper[4819]: W0228 03:50:00.915788 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcd5b8da_92d4_4701_8aac_c77e83d76567.slice/crio-54f91b9fcfdca6dd4ef263a78aaa9e33cde0505a07e05fab6808dbf0633ff210 WatchSource:0}: Error finding container 54f91b9fcfdca6dd4ef263a78aaa9e33cde0505a07e05fab6808dbf0633ff210: Status 404 returned error can't find the container with id 54f91b9fcfdca6dd4ef263a78aaa9e33cde0505a07e05fab6808dbf0633ff210 Feb 28 03:50:01 crc kubenswrapper[4819]: I0228 03:50:01.552202 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537510-8mv82" event={"ID":"dcd5b8da-92d4-4701-8aac-c77e83d76567","Type":"ContainerStarted","Data":"54f91b9fcfdca6dd4ef263a78aaa9e33cde0505a07e05fab6808dbf0633ff210"} Feb 28 03:50:01 crc kubenswrapper[4819]: I0228 03:50:01.555584 4819 generic.go:334] "Generic (PLEG): container finished" podID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerID="5ab93bb2251f8a9fb9c9db9bc6189036f7bfbd545e0f1b6f246a96c7b8188206" exitCode=0 Feb 
28 03:50:01 crc kubenswrapper[4819]: I0228 03:50:01.555638 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerDied","Data":"5ab93bb2251f8a9fb9c9db9bc6189036f7bfbd545e0f1b6f246a96c7b8188206"} Feb 28 03:50:01 crc kubenswrapper[4819]: I0228 03:50:01.555689 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerStarted","Data":"2be5a3de849f4caa81a3f6eb2371d580108119159dd0203e877d29c0441c1708"} Feb 28 03:50:01 crc kubenswrapper[4819]: I0228 03:50:01.555711 4819 scope.go:117] "RemoveContainer" containerID="5ade958a408b150f7a5061d92ed6be3f2480394555f495dbd9814681a29f7247" Feb 28 03:50:03 crc kubenswrapper[4819]: I0228 03:50:03.572294 4819 generic.go:334] "Generic (PLEG): container finished" podID="dcd5b8da-92d4-4701-8aac-c77e83d76567" containerID="71f4ab953d963d5206b80a3b933886691381ae4197e36557863800ff386245c3" exitCode=0 Feb 28 03:50:03 crc kubenswrapper[4819]: I0228 03:50:03.572373 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537510-8mv82" event={"ID":"dcd5b8da-92d4-4701-8aac-c77e83d76567","Type":"ContainerDied","Data":"71f4ab953d963d5206b80a3b933886691381ae4197e36557863800ff386245c3"} Feb 28 03:50:04 crc kubenswrapper[4819]: I0228 03:50:04.936776 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537510-8mv82" Feb 28 03:50:05 crc kubenswrapper[4819]: I0228 03:50:05.048643 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2rwn\" (UniqueName: \"kubernetes.io/projected/dcd5b8da-92d4-4701-8aac-c77e83d76567-kube-api-access-p2rwn\") pod \"dcd5b8da-92d4-4701-8aac-c77e83d76567\" (UID: \"dcd5b8da-92d4-4701-8aac-c77e83d76567\") " Feb 28 03:50:05 crc kubenswrapper[4819]: I0228 03:50:05.072174 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd5b8da-92d4-4701-8aac-c77e83d76567-kube-api-access-p2rwn" (OuterVolumeSpecName: "kube-api-access-p2rwn") pod "dcd5b8da-92d4-4701-8aac-c77e83d76567" (UID: "dcd5b8da-92d4-4701-8aac-c77e83d76567"). InnerVolumeSpecName "kube-api-access-p2rwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:50:05 crc kubenswrapper[4819]: I0228 03:50:05.150428 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2rwn\" (UniqueName: \"kubernetes.io/projected/dcd5b8da-92d4-4701-8aac-c77e83d76567-kube-api-access-p2rwn\") on node \"crc\" DevicePath \"\"" Feb 28 03:50:05 crc kubenswrapper[4819]: I0228 03:50:05.590865 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537510-8mv82" event={"ID":"dcd5b8da-92d4-4701-8aac-c77e83d76567","Type":"ContainerDied","Data":"54f91b9fcfdca6dd4ef263a78aaa9e33cde0505a07e05fab6808dbf0633ff210"} Feb 28 03:50:05 crc kubenswrapper[4819]: I0228 03:50:05.590907 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54f91b9fcfdca6dd4ef263a78aaa9e33cde0505a07e05fab6808dbf0633ff210" Feb 28 03:50:05 crc kubenswrapper[4819]: I0228 03:50:05.590943 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537510-8mv82" Feb 28 03:50:06 crc kubenswrapper[4819]: I0228 03:50:06.012530 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537504-dvppb"] Feb 28 03:50:06 crc kubenswrapper[4819]: I0228 03:50:06.019714 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537504-dvppb"] Feb 28 03:50:06 crc kubenswrapper[4819]: I0228 03:50:06.376675 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e77f7fb-8d87-402f-8a5d-28e75ff27418" path="/var/lib/kubelet/pods/4e77f7fb-8d87-402f-8a5d-28e75ff27418/volumes" Feb 28 03:50:07 crc kubenswrapper[4819]: I0228 03:50:07.656359 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-wpbg5" Feb 28 03:50:07 crc kubenswrapper[4819]: I0228 03:50:07.656482 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-wpbg5" Feb 28 03:50:07 crc kubenswrapper[4819]: I0228 03:50:07.727277 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-wpbg5" Feb 28 03:50:08 crc kubenswrapper[4819]: I0228 03:50:08.667160 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-wpbg5" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.340336 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48"] Feb 28 03:50:17 crc kubenswrapper[4819]: E0228 03:50:17.341579 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd5b8da-92d4-4701-8aac-c77e83d76567" containerName="oc" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.341609 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd5b8da-92d4-4701-8aac-c77e83d76567" 
containerName="oc" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.341843 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd5b8da-92d4-4701-8aac-c77e83d76567" containerName="oc" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.343210 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.346315 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gzck7" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.364613 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48"] Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.440700 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-bundle\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48\" (UID: \"e1f3677a-e092-464a-925c-c0792c7590da\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.440805 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-util\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48\" (UID: \"e1f3677a-e092-464a-925c-c0792c7590da\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.440888 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdxcw\" (UniqueName: 
\"kubernetes.io/projected/e1f3677a-e092-464a-925c-c0792c7590da-kube-api-access-gdxcw\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48\" (UID: \"e1f3677a-e092-464a-925c-c0792c7590da\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.542561 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdxcw\" (UniqueName: \"kubernetes.io/projected/e1f3677a-e092-464a-925c-c0792c7590da-kube-api-access-gdxcw\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48\" (UID: \"e1f3677a-e092-464a-925c-c0792c7590da\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.542737 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-bundle\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48\" (UID: \"e1f3677a-e092-464a-925c-c0792c7590da\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.542853 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-util\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48\" (UID: \"e1f3677a-e092-464a-925c-c0792c7590da\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.543469 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-util\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48\" (UID: 
\"e1f3677a-e092-464a-925c-c0792c7590da\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.544223 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-bundle\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48\" (UID: \"e1f3677a-e092-464a-925c-c0792c7590da\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.575053 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdxcw\" (UniqueName: \"kubernetes.io/projected/e1f3677a-e092-464a-925c-c0792c7590da-kube-api-access-gdxcw\") pod \"79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48\" (UID: \"e1f3677a-e092-464a-925c-c0792c7590da\") " pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.688233 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48"
Feb 28 03:50:17 crc kubenswrapper[4819]: I0228 03:50:17.972425 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48"]
Feb 28 03:50:17 crc kubenswrapper[4819]: W0228 03:50:17.980119 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice/crio-22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8 WatchSource:0}: Error finding container 22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8: Status 404 returned error can't find the container with id 22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8
Feb 28 03:50:18 crc kubenswrapper[4819]: I0228 03:50:18.691977 4819 generic.go:334] "Generic (PLEG): container finished" podID="e1f3677a-e092-464a-925c-c0792c7590da" containerID="aba080e6484c8527072a5e8957dd2736f9d3132d53bf1440cf98659597c79d0f" exitCode=0
Feb 28 03:50:18 crc kubenswrapper[4819]: I0228 03:50:18.692323 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" event={"ID":"e1f3677a-e092-464a-925c-c0792c7590da","Type":"ContainerDied","Data":"aba080e6484c8527072a5e8957dd2736f9d3132d53bf1440cf98659597c79d0f"}
Feb 28 03:50:18 crc kubenswrapper[4819]: I0228 03:50:18.692377 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" event={"ID":"e1f3677a-e092-464a-925c-c0792c7590da","Type":"ContainerStarted","Data":"22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8"}
Feb 28 03:50:19 crc kubenswrapper[4819]: I0228 03:50:19.701744 4819 generic.go:334] "Generic (PLEG): container finished" podID="e1f3677a-e092-464a-925c-c0792c7590da" containerID="c0d50663bbe2d666c36a9da32faba86cc97b14ad031d079b7ce21c3c84ad2ea7" exitCode=0
Feb 28 03:50:19 crc kubenswrapper[4819]: I0228 03:50:19.701808 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" event={"ID":"e1f3677a-e092-464a-925c-c0792c7590da","Type":"ContainerDied","Data":"c0d50663bbe2d666c36a9da32faba86cc97b14ad031d079b7ce21c3c84ad2ea7"}
Feb 28 03:50:20 crc kubenswrapper[4819]: I0228 03:50:20.713456 4819 generic.go:334] "Generic (PLEG): container finished" podID="e1f3677a-e092-464a-925c-c0792c7590da" containerID="a07edff885000ba80b026778c2a5052473c759caa180df7643b8fc5aedbe0105" exitCode=0
Feb 28 03:50:20 crc kubenswrapper[4819]: I0228 03:50:20.713531 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" event={"ID":"e1f3677a-e092-464a-925c-c0792c7590da","Type":"ContainerDied","Data":"a07edff885000ba80b026778c2a5052473c759caa180df7643b8fc5aedbe0105"}
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.037108 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48"
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.214865 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-bundle\") pod \"e1f3677a-e092-464a-925c-c0792c7590da\" (UID: \"e1f3677a-e092-464a-925c-c0792c7590da\") "
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.214947 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-util\") pod \"e1f3677a-e092-464a-925c-c0792c7590da\" (UID: \"e1f3677a-e092-464a-925c-c0792c7590da\") "
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.214992 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdxcw\" (UniqueName: \"kubernetes.io/projected/e1f3677a-e092-464a-925c-c0792c7590da-kube-api-access-gdxcw\") pod \"e1f3677a-e092-464a-925c-c0792c7590da\" (UID: \"e1f3677a-e092-464a-925c-c0792c7590da\") "
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.219489 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-bundle" (OuterVolumeSpecName: "bundle") pod "e1f3677a-e092-464a-925c-c0792c7590da" (UID: "e1f3677a-e092-464a-925c-c0792c7590da"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.225686 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f3677a-e092-464a-925c-c0792c7590da-kube-api-access-gdxcw" (OuterVolumeSpecName: "kube-api-access-gdxcw") pod "e1f3677a-e092-464a-925c-c0792c7590da" (UID: "e1f3677a-e092-464a-925c-c0792c7590da"). InnerVolumeSpecName "kube-api-access-gdxcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.245322 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-util" (OuterVolumeSpecName: "util") pod "e1f3677a-e092-464a-925c-c0792c7590da" (UID: "e1f3677a-e092-464a-925c-c0792c7590da"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.317092 4819 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.317143 4819 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1f3677a-e092-464a-925c-c0792c7590da-util\") on node \"crc\" DevicePath \"\""
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.317159 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdxcw\" (UniqueName: \"kubernetes.io/projected/e1f3677a-e092-464a-925c-c0792c7590da-kube-api-access-gdxcw\") on node \"crc\" DevicePath \"\""
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.733559 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48" event={"ID":"e1f3677a-e092-464a-925c-c0792c7590da","Type":"ContainerDied","Data":"22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8"}
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.733639 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8"
Feb 28 03:50:22 crc kubenswrapper[4819]: I0228 03:50:22.733659 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48"
Feb 28 03:50:24 crc kubenswrapper[4819]: E0228 03:50:24.341431 4819 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice/crio-22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8\": RecentStats: unable to find data in memory cache]"
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.811565 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"]
Feb 28 03:50:32 crc kubenswrapper[4819]: E0228 03:50:32.812509 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f3677a-e092-464a-925c-c0792c7590da" containerName="util"
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.812588 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f3677a-e092-464a-925c-c0792c7590da" containerName="util"
Feb 28 03:50:32 crc kubenswrapper[4819]: E0228 03:50:32.812599 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f3677a-e092-464a-925c-c0792c7590da" containerName="pull"
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.812606 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f3677a-e092-464a-925c-c0792c7590da" containerName="pull"
Feb 28 03:50:32 crc kubenswrapper[4819]: E0228 03:50:32.812620 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f3677a-e092-464a-925c-c0792c7590da" containerName="extract"
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.812627 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f3677a-e092-464a-925c-c0792c7590da" containerName="extract"
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.812745 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f3677a-e092-464a-925c-c0792c7590da" containerName="extract"
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.813150 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.817789 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert"
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.818517 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-jr5wr"
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.846230 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"]
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.966205 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpktt\" (UniqueName: \"kubernetes.io/projected/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-kube-api-access-zpktt\") pod \"infra-operator-controller-manager-65c67cfc-55thv\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") " pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.966317 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-webhook-cert\") pod \"infra-operator-controller-manager-65c67cfc-55thv\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") " pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:50:32 crc kubenswrapper[4819]: I0228 03:50:32.966375 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-apiservice-cert\") pod \"infra-operator-controller-manager-65c67cfc-55thv\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") " pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:50:33 crc kubenswrapper[4819]: I0228 03:50:33.067824 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpktt\" (UniqueName: \"kubernetes.io/projected/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-kube-api-access-zpktt\") pod \"infra-operator-controller-manager-65c67cfc-55thv\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") " pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:50:33 crc kubenswrapper[4819]: I0228 03:50:33.068144 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-webhook-cert\") pod \"infra-operator-controller-manager-65c67cfc-55thv\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") " pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:50:33 crc kubenswrapper[4819]: I0228 03:50:33.068364 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-apiservice-cert\") pod \"infra-operator-controller-manager-65c67cfc-55thv\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") " pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:50:33 crc kubenswrapper[4819]: I0228 03:50:33.074685 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-apiservice-cert\") pod \"infra-operator-controller-manager-65c67cfc-55thv\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") " pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:50:33 crc kubenswrapper[4819]: I0228 03:50:33.077388 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-webhook-cert\") pod \"infra-operator-controller-manager-65c67cfc-55thv\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") " pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:50:33 crc kubenswrapper[4819]: I0228 03:50:33.088386 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpktt\" (UniqueName: \"kubernetes.io/projected/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-kube-api-access-zpktt\") pod \"infra-operator-controller-manager-65c67cfc-55thv\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") " pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:50:33 crc kubenswrapper[4819]: I0228 03:50:33.133673 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:50:33 crc kubenswrapper[4819]: I0228 03:50:33.371380 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"]
Feb 28 03:50:33 crc kubenswrapper[4819]: I0228 03:50:33.856109 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv" event={"ID":"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e","Type":"ContainerStarted","Data":"df9658a1ba77981e1292a68491eb5e6a1e9e360991f200dbb8d3ba0ec05dd6ae"}
Feb 28 03:50:34 crc kubenswrapper[4819]: E0228 03:50:34.476423 4819 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice/crio-22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice\": RecentStats: unable to find data in memory cache]"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.810622 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"]
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.811562 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.821305 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"galera-openstack-dockercfg-gb29b"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.823672 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openstack-config-data"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.823773 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openshift-service-ca.crt"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.823908 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openstack-scripts"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.824060 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"kube-root-ca.crt"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.834811 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"]
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.847397 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"]
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.873070 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.880324 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"]
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.885506 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.902891 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdkww\" (UniqueName: \"kubernetes.io/projected/81cae302-997a-482b-b76a-90b1172083b1-kube-api-access-tdkww\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.903506 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.903598 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.904946 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.905127 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.905358 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/81cae302-997a-482b-b76a-90b1172083b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.961753 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"]
Feb 28 03:50:34 crc kubenswrapper[4819]: I0228 03:50:34.993307 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"]
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009002 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-default\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009093 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009117 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009139 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009164 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/81cae302-997a-482b-b76a-90b1172083b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009185 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kolla-config\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009212 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdkww\" (UniqueName: \"kubernetes.io/projected/81cae302-997a-482b-b76a-90b1172083b1-kube-api-access-tdkww\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009230 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009263 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-operator-scripts\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009315 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009334 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-kolla-config\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009361 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-generated\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.009381 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-default\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.011012 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.011320 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/81cae302-997a-482b-b76a-90b1172083b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.011784 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.012264 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.012319 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.012371 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cglx8\" (UniqueName: \"kubernetes.io/projected/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kube-api-access-cglx8\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.012398 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2clhx\" (UniqueName: \"kubernetes.io/projected/9b16747d-1b6b-44ee-896e-0ead9587deeb-kube-api-access-2clhx\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.012418 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-generated\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.012433 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-operator-scripts\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.012737 4819 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") device mount path \"/mnt/openstack/pv12\"" pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.058288 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdkww\" (UniqueName: \"kubernetes.io/projected/81cae302-997a-482b-b76a-90b1172083b1-kube-api-access-tdkww\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.065472 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113514 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-kolla-config\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113575 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-generated\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113599 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-default\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113623 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cglx8\" (UniqueName: \"kubernetes.io/projected/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kube-api-access-cglx8\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113644 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2clhx\" (UniqueName: \"kubernetes.io/projected/9b16747d-1b6b-44ee-896e-0ead9587deeb-kube-api-access-2clhx\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113660 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-generated\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113673 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-operator-scripts\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113693 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-default\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113711 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113733 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113756 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kolla-config\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.113780 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-operator-scripts\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.115133 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-operator-scripts\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.115567 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-kolla-config\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.115784 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-generated\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.116354 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-default\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.117549 4819 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") device mount path \"/mnt/openstack/pv03\"" pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.117719 4819 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") device mount path \"/mnt/openstack/pv07\"" pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.118220 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-default\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.118710 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kolla-config\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.118780 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-operator-scripts\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.122395 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-generated\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.135197 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cglx8\" (UniqueName: \"kubernetes.io/projected/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kube-api-access-cglx8\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.135631 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2clhx\" (UniqueName: \"kubernetes.io/projected/9b16747d-1b6b-44ee-896e-0ead9587deeb-kube-api-access-2clhx\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.136543 4819 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " pod="barbican-kuttl-tests/openstack-galera-1" Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.145326 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " pod="barbican-kuttl-tests/openstack-galera-2" Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.180680 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.200622 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.211824 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.719825 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.880207 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Feb 28 03:50:35 crc kubenswrapper[4819]: W0228 03:50:35.882392 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81cae302_997a_482b_b76a_90b1172083b1.slice/crio-59bdc841d38f540743d985644b78431bf67270a26d2e9239d7053f095d41e60f WatchSource:0}: Error finding container 59bdc841d38f540743d985644b78431bf67270a26d2e9239d7053f095d41e60f: Status 404 returned error can't find the container with id 59bdc841d38f540743d985644b78431bf67270a26d2e9239d7053f095d41e60f Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.966029 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"81cae302-997a-482b-b76a-90b1172083b1","Type":"ContainerStarted","Data":"59bdc841d38f540743d985644b78431bf67270a26d2e9239d7053f095d41e60f"} Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.967228 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"a18d0ed2-f5db-4f32-b635-7956eeee1f01","Type":"ContainerStarted","Data":"d3abc08dd4cb84d0f83276eb36a360e01f33b59023d96e2be34ae56ac34cd48f"} Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.968762 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv" event={"ID":"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e","Type":"ContainerStarted","Data":"0bb9b650991cd37cb1b8f87b559e76cbd1ec40205e835fb260cf2999742c3502"} Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.968894 4819 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv" Feb 28 03:50:35 crc kubenswrapper[4819]: I0228 03:50:35.995894 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv" podStartSLOduration=1.911884826 podStartE2EDuration="3.995852969s" podCreationTimestamp="2026-02-28 03:50:32 +0000 UTC" firstStartedPulling="2026-02-28 03:50:33.385455722 +0000 UTC m=+971.851024580" lastFinishedPulling="2026-02-28 03:50:35.469423855 +0000 UTC m=+973.934992723" observedRunningTime="2026-02-28 03:50:35.990583438 +0000 UTC m=+974.456152296" watchObservedRunningTime="2026-02-28 03:50:35.995852969 +0000 UTC m=+974.461421827" Feb 28 03:50:36 crc kubenswrapper[4819]: I0228 03:50:36.006462 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Feb 28 03:50:36 crc kubenswrapper[4819]: W0228 03:50:36.013169 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b16747d_1b6b_44ee_896e_0ead9587deeb.slice/crio-c68709d983c6c4c4a3ac3b30bccfa9121a26c11a1d38b49fb69311681a23f363 WatchSource:0}: Error finding container c68709d983c6c4c4a3ac3b30bccfa9121a26c11a1d38b49fb69311681a23f363: Status 404 returned error can't find the container with id c68709d983c6c4c4a3ac3b30bccfa9121a26c11a1d38b49fb69311681a23f363 Feb 28 03:50:36 crc kubenswrapper[4819]: I0228 03:50:36.978715 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9b16747d-1b6b-44ee-896e-0ead9587deeb","Type":"ContainerStarted","Data":"c68709d983c6c4c4a3ac3b30bccfa9121a26c11a1d38b49fb69311681a23f363"} Feb 28 03:50:43 crc kubenswrapper[4819]: I0228 03:50:43.139821 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv" Feb 28 03:50:44 crc kubenswrapper[4819]: E0228 03:50:44.599403 4819 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice/crio-22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice\": RecentStats: unable to find data in memory cache]" Feb 28 03:50:45 crc kubenswrapper[4819]: I0228 03:50:45.543180 4819 scope.go:117] "RemoveContainer" containerID="698393bb4ab5d3e2f2ba104e16efc862fdd69434a764aa0a71e40dbdec4c5f8d" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.149392 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/memcached-0"] Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.150978 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.153611 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"memcached-config-data" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.153942 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"memcached-memcached-dockercfg-tq7w4" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.160538 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.199202 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-kolla-config\") pod \"memcached-0\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.199289 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmqc8\" (UniqueName: \"kubernetes.io/projected/c684da03-6893-45f7-833c-2e71ad6c7e47-kube-api-access-rmqc8\") pod \"memcached-0\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.199359 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-config-data\") pod \"memcached-0\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.300794 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-kolla-config\") pod \"memcached-0\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.300871 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmqc8\" (UniqueName: \"kubernetes.io/projected/c684da03-6893-45f7-833c-2e71ad6c7e47-kube-api-access-rmqc8\") pod \"memcached-0\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.300940 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-config-data\") pod \"memcached-0\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.301726 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-config-data\") pod \"memcached-0\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.302182 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-kolla-config\") pod \"memcached-0\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.337889 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmqc8\" (UniqueName: \"kubernetes.io/projected/c684da03-6893-45f7-833c-2e71ad6c7e47-kube-api-access-rmqc8\") pod \"memcached-0\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " 
pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:46 crc kubenswrapper[4819]: I0228 03:50:46.473531 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:47 crc kubenswrapper[4819]: I0228 03:50:47.053430 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jdfkd"] Feb 28 03:50:47 crc kubenswrapper[4819]: I0228 03:50:47.054114 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" Feb 28 03:50:47 crc kubenswrapper[4819]: I0228 03:50:47.056597 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-n8zcv" Feb 28 03:50:47 crc kubenswrapper[4819]: I0228 03:50:47.067111 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jdfkd"] Feb 28 03:50:47 crc kubenswrapper[4819]: I0228 03:50:47.116940 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr99j\" (UniqueName: \"kubernetes.io/projected/e3da2c06-ff12-4661-9164-bdce48392642-kube-api-access-hr99j\") pod \"rabbitmq-cluster-operator-index-jdfkd\" (UID: \"e3da2c06-ff12-4661-9164-bdce48392642\") " pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" Feb 28 03:50:47 crc kubenswrapper[4819]: I0228 03:50:47.218384 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr99j\" (UniqueName: \"kubernetes.io/projected/e3da2c06-ff12-4661-9164-bdce48392642-kube-api-access-hr99j\") pod \"rabbitmq-cluster-operator-index-jdfkd\" (UID: \"e3da2c06-ff12-4661-9164-bdce48392642\") " pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" Feb 28 03:50:47 crc kubenswrapper[4819]: I0228 03:50:47.240897 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-hr99j\" (UniqueName: \"kubernetes.io/projected/e3da2c06-ff12-4661-9164-bdce48392642-kube-api-access-hr99j\") pod \"rabbitmq-cluster-operator-index-jdfkd\" (UID: \"e3da2c06-ff12-4661-9164-bdce48392642\") " pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" Feb 28 03:50:47 crc kubenswrapper[4819]: I0228 03:50:47.298872 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Feb 28 03:50:47 crc kubenswrapper[4819]: I0228 03:50:47.426540 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" Feb 28 03:50:47 crc kubenswrapper[4819]: I0228 03:50:47.622195 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jdfkd"] Feb 28 03:50:47 crc kubenswrapper[4819]: W0228 03:50:47.628306 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3da2c06_ff12_4661_9164_bdce48392642.slice/crio-246a893fc44f1fa6a0792c4ded347498ea33b5d1f47cf9490a01034e44ff9bd3 WatchSource:0}: Error finding container 246a893fc44f1fa6a0792c4ded347498ea33b5d1f47cf9490a01034e44ff9bd3: Status 404 returned error can't find the container with id 246a893fc44f1fa6a0792c4ded347498ea33b5d1f47cf9490a01034e44ff9bd3 Feb 28 03:50:48 crc kubenswrapper[4819]: I0228 03:50:48.120288 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"a18d0ed2-f5db-4f32-b635-7956eeee1f01","Type":"ContainerStarted","Data":"5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d"} Feb 28 03:50:48 crc kubenswrapper[4819]: I0228 03:50:48.121904 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" 
event={"ID":"e3da2c06-ff12-4661-9164-bdce48392642","Type":"ContainerStarted","Data":"246a893fc44f1fa6a0792c4ded347498ea33b5d1f47cf9490a01034e44ff9bd3"} Feb 28 03:50:48 crc kubenswrapper[4819]: I0228 03:50:48.122754 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"c684da03-6893-45f7-833c-2e71ad6c7e47","Type":"ContainerStarted","Data":"2f46cf4ac61759660ed083d4f4e98911b369237af71213522ae18fc1c5e3c2c8"} Feb 28 03:50:48 crc kubenswrapper[4819]: I0228 03:50:48.125873 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"81cae302-997a-482b-b76a-90b1172083b1","Type":"ContainerStarted","Data":"6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d"} Feb 28 03:50:48 crc kubenswrapper[4819]: I0228 03:50:48.130783 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9b16747d-1b6b-44ee-896e-0ead9587deeb","Type":"ContainerStarted","Data":"5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f"} Feb 28 03:50:51 crc kubenswrapper[4819]: I0228 03:50:51.248184 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jdfkd"] Feb 28 03:50:51 crc kubenswrapper[4819]: I0228 03:50:51.860260 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-whnjt"] Feb 28 03:50:51 crc kubenswrapper[4819]: I0228 03:50:51.860948 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt" Feb 28 03:50:51 crc kubenswrapper[4819]: I0228 03:50:51.890986 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7slbz\" (UniqueName: \"kubernetes.io/projected/96706889-ea67-4e43-ab82-6a6026090647-kube-api-access-7slbz\") pod \"rabbitmq-cluster-operator-index-whnjt\" (UID: \"96706889-ea67-4e43-ab82-6a6026090647\") " pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt" Feb 28 03:50:51 crc kubenswrapper[4819]: I0228 03:50:51.915438 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-whnjt"] Feb 28 03:50:51 crc kubenswrapper[4819]: I0228 03:50:51.992416 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7slbz\" (UniqueName: \"kubernetes.io/projected/96706889-ea67-4e43-ab82-6a6026090647-kube-api-access-7slbz\") pod \"rabbitmq-cluster-operator-index-whnjt\" (UID: \"96706889-ea67-4e43-ab82-6a6026090647\") " pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt" Feb 28 03:50:52 crc kubenswrapper[4819]: I0228 03:50:52.013215 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7slbz\" (UniqueName: \"kubernetes.io/projected/96706889-ea67-4e43-ab82-6a6026090647-kube-api-access-7slbz\") pod \"rabbitmq-cluster-operator-index-whnjt\" (UID: \"96706889-ea67-4e43-ab82-6a6026090647\") " pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt" Feb 28 03:50:52 crc kubenswrapper[4819]: I0228 03:50:52.175328 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt" Feb 28 03:50:52 crc kubenswrapper[4819]: I0228 03:50:52.188454 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"c684da03-6893-45f7-833c-2e71ad6c7e47","Type":"ContainerStarted","Data":"9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0"} Feb 28 03:50:52 crc kubenswrapper[4819]: I0228 03:50:52.189056 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/memcached-0" Feb 28 03:50:52 crc kubenswrapper[4819]: I0228 03:50:52.212504 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/memcached-0" podStartSLOduration=2.5532163409999997 podStartE2EDuration="6.212486156s" podCreationTimestamp="2026-02-28 03:50:46 +0000 UTC" firstStartedPulling="2026-02-28 03:50:47.310487919 +0000 UTC m=+985.776056777" lastFinishedPulling="2026-02-28 03:50:50.969757734 +0000 UTC m=+989.435326592" observedRunningTime="2026-02-28 03:50:52.210267391 +0000 UTC m=+990.675836249" watchObservedRunningTime="2026-02-28 03:50:52.212486156 +0000 UTC m=+990.678055014" Feb 28 03:50:53 crc kubenswrapper[4819]: I0228 03:50:53.037707 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-whnjt"] Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.198724 4819 generic.go:334] "Generic (PLEG): container finished" podID="9b16747d-1b6b-44ee-896e-0ead9587deeb" containerID="5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f" exitCode=0 Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.198795 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9b16747d-1b6b-44ee-896e-0ead9587deeb","Type":"ContainerDied","Data":"5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f"} Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 
03:50:54.201092 4819 generic.go:334] "Generic (PLEG): container finished" podID="a18d0ed2-f5db-4f32-b635-7956eeee1f01" containerID="5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d" exitCode=0 Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.201173 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"a18d0ed2-f5db-4f32-b635-7956eeee1f01","Type":"ContainerDied","Data":"5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d"} Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.202843 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" event={"ID":"e3da2c06-ff12-4661-9164-bdce48392642","Type":"ContainerStarted","Data":"c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e"} Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.202950 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" podUID="e3da2c06-ff12-4661-9164-bdce48392642" containerName="registry-server" containerID="cri-o://c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e" gracePeriod=2 Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.204411 4819 generic.go:334] "Generic (PLEG): container finished" podID="81cae302-997a-482b-b76a-90b1172083b1" containerID="6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d" exitCode=0 Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.204510 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"81cae302-997a-482b-b76a-90b1172083b1","Type":"ContainerDied","Data":"6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d"} Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.205728 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt" 
event={"ID":"96706889-ea67-4e43-ab82-6a6026090647","Type":"ContainerStarted","Data":"2d0d2b746567876aabdecd7c89f7e1e03a4831d8f6aad332f0604617fe0163d1"} Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.298136 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" podStartSLOduration=0.967034563 podStartE2EDuration="7.29811453s" podCreationTimestamp="2026-02-28 03:50:47 +0000 UTC" firstStartedPulling="2026-02-28 03:50:47.630313351 +0000 UTC m=+986.095882209" lastFinishedPulling="2026-02-28 03:50:53.961393318 +0000 UTC m=+992.426962176" observedRunningTime="2026-02-28 03:50:54.285765994 +0000 UTC m=+992.751334842" watchObservedRunningTime="2026-02-28 03:50:54.29811453 +0000 UTC m=+992.763683388" Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.575019 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.736792 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr99j\" (UniqueName: \"kubernetes.io/projected/e3da2c06-ff12-4661-9164-bdce48392642-kube-api-access-hr99j\") pod \"e3da2c06-ff12-4661-9164-bdce48392642\" (UID: \"e3da2c06-ff12-4661-9164-bdce48392642\") " Feb 28 03:50:54 crc kubenswrapper[4819]: E0228 03:50:54.739026 4819 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice/crio-22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8\": RecentStats: unable to find data in memory cache]" Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 
03:50:54.741511 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3da2c06-ff12-4661-9164-bdce48392642-kube-api-access-hr99j" (OuterVolumeSpecName: "kube-api-access-hr99j") pod "e3da2c06-ff12-4661-9164-bdce48392642" (UID: "e3da2c06-ff12-4661-9164-bdce48392642"). InnerVolumeSpecName "kube-api-access-hr99j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:50:54 crc kubenswrapper[4819]: I0228 03:50:54.838420 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr99j\" (UniqueName: \"kubernetes.io/projected/e3da2c06-ff12-4661-9164-bdce48392642-kube-api-access-hr99j\") on node \"crc\" DevicePath \"\"" Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.214807 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"81cae302-997a-482b-b76a-90b1172083b1","Type":"ContainerStarted","Data":"3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e"} Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.216662 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt" event={"ID":"96706889-ea67-4e43-ab82-6a6026090647","Type":"ContainerStarted","Data":"40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378"} Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.218706 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9b16747d-1b6b-44ee-896e-0ead9587deeb","Type":"ContainerStarted","Data":"eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f"} Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.221051 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"a18d0ed2-f5db-4f32-b635-7956eeee1f01","Type":"ContainerStarted","Data":"5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb"} Feb 28 03:50:55 crc 
kubenswrapper[4819]: I0228 03:50:55.223159 4819 generic.go:334] "Generic (PLEG): container finished" podID="e3da2c06-ff12-4661-9164-bdce48392642" containerID="c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e" exitCode=0
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.223210 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" event={"ID":"e3da2c06-ff12-4661-9164-bdce48392642","Type":"ContainerDied","Data":"c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e"}
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.223294 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd"
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.223341 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-jdfkd" event={"ID":"e3da2c06-ff12-4661-9164-bdce48392642","Type":"ContainerDied","Data":"246a893fc44f1fa6a0792c4ded347498ea33b5d1f47cf9490a01034e44ff9bd3"}
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.223387 4819 scope.go:117] "RemoveContainer" containerID="c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e"
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.242137 4819 scope.go:117] "RemoveContainer" containerID="c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e"
Feb 28 03:50:55 crc kubenswrapper[4819]: E0228 03:50:55.242868 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e\": container with ID starting with c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e not found: ID does not exist" containerID="c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e"
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.242931 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e"} err="failed to get container status \"c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e\": rpc error: code = NotFound desc = could not find container \"c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e\": container with ID starting with c26b3e00f1cef8832c4080a76ae5fdce1c807bab7c7fd5a97bf00a5a65266f9e not found: ID does not exist"
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.255430 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-0" podStartSLOduration=10.875689648 podStartE2EDuration="22.255410214s" podCreationTimestamp="2026-02-28 03:50:33 +0000 UTC" firstStartedPulling="2026-02-28 03:50:35.884604096 +0000 UTC m=+974.350172954" lastFinishedPulling="2026-02-28 03:50:47.264324652 +0000 UTC m=+985.729893520" observedRunningTime="2026-02-28 03:50:55.251776784 +0000 UTC m=+993.717345702" watchObservedRunningTime="2026-02-28 03:50:55.255410214 +0000 UTC m=+993.720979082"
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.283770 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-1" podStartSLOduration=10.750914109 podStartE2EDuration="22.283738588s" podCreationTimestamp="2026-02-28 03:50:33 +0000 UTC" firstStartedPulling="2026-02-28 03:50:35.72935559 +0000 UTC m=+974.194924438" lastFinishedPulling="2026-02-28 03:50:47.262180059 +0000 UTC m=+985.727748917" observedRunningTime="2026-02-28 03:50:55.278104648 +0000 UTC m=+993.743673566" watchObservedRunningTime="2026-02-28 03:50:55.283738588 +0000 UTC m=+993.749307486"
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.302785 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jdfkd"]
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.310963 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jdfkd"]
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.314400 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt" podStartSLOduration=3.899378862 podStartE2EDuration="4.314377589s" podCreationTimestamp="2026-02-28 03:50:51 +0000 UTC" firstStartedPulling="2026-02-28 03:50:53.911393086 +0000 UTC m=+992.376961944" lastFinishedPulling="2026-02-28 03:50:54.326391803 +0000 UTC m=+992.791960671" observedRunningTime="2026-02-28 03:50:55.308650276 +0000 UTC m=+993.774219144" watchObservedRunningTime="2026-02-28 03:50:55.314377589 +0000 UTC m=+993.779946467"
Feb 28 03:50:55 crc kubenswrapper[4819]: I0228 03:50:55.326391 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-2" podStartSLOduration=11.068805453 podStartE2EDuration="22.326366716s" podCreationTimestamp="2026-02-28 03:50:33 +0000 UTC" firstStartedPulling="2026-02-28 03:50:36.015485316 +0000 UTC m=+974.481054174" lastFinishedPulling="2026-02-28 03:50:47.273046579 +0000 UTC m=+985.738615437" observedRunningTime="2026-02-28 03:50:55.323938956 +0000 UTC m=+993.789507834" watchObservedRunningTime="2026-02-28 03:50:55.326366716 +0000 UTC m=+993.791935584"
Feb 28 03:50:56 crc kubenswrapper[4819]: I0228 03:50:56.382567 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3da2c06-ff12-4661-9164-bdce48392642" path="/var/lib/kubelet/pods/e3da2c06-ff12-4661-9164-bdce48392642/volumes"
Feb 28 03:50:56 crc kubenswrapper[4819]: I0228 03:50:56.475395 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/memcached-0"
Feb 28 03:51:02 crc kubenswrapper[4819]: I0228 03:51:02.175911 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt"
Feb 28 03:51:02 crc kubenswrapper[4819]: I0228 03:51:02.176290 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt"
Feb 28 03:51:02 crc kubenswrapper[4819]: I0228 03:51:02.230234 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt"
Feb 28 03:51:02 crc kubenswrapper[4819]: I0228 03:51:02.940324 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.716947 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"]
Feb 28 03:51:04 crc kubenswrapper[4819]: E0228 03:51:04.717587 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3da2c06-ff12-4661-9164-bdce48392642" containerName="registry-server"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.717647 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3da2c06-ff12-4661-9164-bdce48392642" containerName="registry-server"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.718026 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3da2c06-ff12-4661-9164-bdce48392642" containerName="registry-server"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.719751 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.726280 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gzck7"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.735523 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"]
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.783024 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.783099 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.783172 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlm8p\" (UniqueName: \"kubernetes.io/projected/65a44caf-c5d8-4b98-ad4f-16833053b82b-kube-api-access-rlm8p\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:04 crc kubenswrapper[4819]: E0228 03:51:04.883585 4819 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice/crio-22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8\": RecentStats: unable to find data in memory cache]"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.884193 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.883849 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.884321 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlm8p\" (UniqueName: \"kubernetes.io/projected/65a44caf-c5d8-4b98-ad4f-16833053b82b-kube-api-access-rlm8p\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.884362 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.884657 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:04 crc kubenswrapper[4819]: I0228 03:51:04.915159 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlm8p\" (UniqueName: \"kubernetes.io/projected/65a44caf-c5d8-4b98-ad4f-16833053b82b-kube-api-access-rlm8p\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.076069 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.180822 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.181030 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-0"
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.200873 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.200918 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-1"
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.212634 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.212677 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.300948 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"]
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.382219 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.929593 4819 generic.go:334] "Generic (PLEG): container finished" podID="65a44caf-c5d8-4b98-ad4f-16833053b82b" containerID="523d8d766f60d5aa31d4dc3fd74b03f3f3318e46b28e33be6a2ff5a06367f63f" exitCode=0
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.929698 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x" event={"ID":"65a44caf-c5d8-4b98-ad4f-16833053b82b","Type":"ContainerDied","Data":"523d8d766f60d5aa31d4dc3fd74b03f3f3318e46b28e33be6a2ff5a06367f63f"}
Feb 28 03:51:05 crc kubenswrapper[4819]: I0228 03:51:05.930133 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x" event={"ID":"65a44caf-c5d8-4b98-ad4f-16833053b82b","Type":"ContainerStarted","Data":"5a2ff8faeccc0c7c56e0d4800e2f13cde708693cdd44f23e1d318a250596cc02"}
Feb 28 03:51:06 crc kubenswrapper[4819]: I0228 03:51:06.051218 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:51:07 crc kubenswrapper[4819]: I0228 03:51:07.970184 4819 generic.go:334] "Generic (PLEG): container finished" podID="65a44caf-c5d8-4b98-ad4f-16833053b82b" containerID="22652fedb265636c3a9ade8a81db4e542658bbf7662ee83c711e12c19a56902c" exitCode=0
Feb 28 03:51:07 crc kubenswrapper[4819]: I0228 03:51:07.970296 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x" event={"ID":"65a44caf-c5d8-4b98-ad4f-16833053b82b","Type":"ContainerDied","Data":"22652fedb265636c3a9ade8a81db4e542658bbf7662ee83c711e12c19a56902c"}
Feb 28 03:51:08 crc kubenswrapper[4819]: I0228 03:51:08.980184 4819 generic.go:334] "Generic (PLEG): container finished" podID="65a44caf-c5d8-4b98-ad4f-16833053b82b" containerID="7cb67bd71228f5246ee73d9ec3627ce72963ad2cc2b1308cef1dfeb60b347973" exitCode=0
Feb 28 03:51:08 crc kubenswrapper[4819]: I0228 03:51:08.980492 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x" event={"ID":"65a44caf-c5d8-4b98-ad4f-16833053b82b","Type":"ContainerDied","Data":"7cb67bd71228f5246ee73d9ec3627ce72963ad2cc2b1308cef1dfeb60b347973"}
Feb 28 03:51:10 crc kubenswrapper[4819]: I0228 03:51:10.413035 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:10 crc kubenswrapper[4819]: I0228 03:51:10.474609 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-util\") pod \"65a44caf-c5d8-4b98-ad4f-16833053b82b\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") "
Feb 28 03:51:10 crc kubenswrapper[4819]: I0228 03:51:10.474679 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlm8p\" (UniqueName: \"kubernetes.io/projected/65a44caf-c5d8-4b98-ad4f-16833053b82b-kube-api-access-rlm8p\") pod \"65a44caf-c5d8-4b98-ad4f-16833053b82b\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") "
Feb 28 03:51:10 crc kubenswrapper[4819]: I0228 03:51:10.474705 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-bundle\") pod \"65a44caf-c5d8-4b98-ad4f-16833053b82b\" (UID: \"65a44caf-c5d8-4b98-ad4f-16833053b82b\") "
Feb 28 03:51:10 crc kubenswrapper[4819]: I0228 03:51:10.475494 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-bundle" (OuterVolumeSpecName: "bundle") pod "65a44caf-c5d8-4b98-ad4f-16833053b82b" (UID: "65a44caf-c5d8-4b98-ad4f-16833053b82b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:51:10 crc kubenswrapper[4819]: I0228 03:51:10.476203 4819 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 03:51:10 crc kubenswrapper[4819]: I0228 03:51:10.480920 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a44caf-c5d8-4b98-ad4f-16833053b82b-kube-api-access-rlm8p" (OuterVolumeSpecName: "kube-api-access-rlm8p") pod "65a44caf-c5d8-4b98-ad4f-16833053b82b" (UID: "65a44caf-c5d8-4b98-ad4f-16833053b82b"). InnerVolumeSpecName "kube-api-access-rlm8p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:51:10 crc kubenswrapper[4819]: I0228 03:51:10.577997 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlm8p\" (UniqueName: \"kubernetes.io/projected/65a44caf-c5d8-4b98-ad4f-16833053b82b-kube-api-access-rlm8p\") on node \"crc\" DevicePath \"\""
Feb 28 03:51:10 crc kubenswrapper[4819]: I0228 03:51:10.820704 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-util" (OuterVolumeSpecName: "util") pod "65a44caf-c5d8-4b98-ad4f-16833053b82b" (UID: "65a44caf-c5d8-4b98-ad4f-16833053b82b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:51:10 crc kubenswrapper[4819]: I0228 03:51:10.882590 4819 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65a44caf-c5d8-4b98-ad4f-16833053b82b-util\") on node \"crc\" DevicePath \"\""
Feb 28 03:51:11 crc kubenswrapper[4819]: I0228 03:51:11.004403 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x" event={"ID":"65a44caf-c5d8-4b98-ad4f-16833053b82b","Type":"ContainerDied","Data":"5a2ff8faeccc0c7c56e0d4800e2f13cde708693cdd44f23e1d318a250596cc02"}
Feb 28 03:51:11 crc kubenswrapper[4819]: I0228 03:51:11.004489 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a2ff8faeccc0c7c56e0d4800e2f13cde708693cdd44f23e1d318a250596cc02"
Feb 28 03:51:11 crc kubenswrapper[4819]: I0228 03:51:11.004440 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.671073 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hd5pb"]
Feb 28 03:51:12 crc kubenswrapper[4819]: E0228 03:51:12.671584 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a44caf-c5d8-4b98-ad4f-16833053b82b" containerName="extract"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.671610 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a44caf-c5d8-4b98-ad4f-16833053b82b" containerName="extract"
Feb 28 03:51:12 crc kubenswrapper[4819]: E0228 03:51:12.671666 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a44caf-c5d8-4b98-ad4f-16833053b82b" containerName="pull"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.671679 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a44caf-c5d8-4b98-ad4f-16833053b82b" containerName="pull"
Feb 28 03:51:12 crc kubenswrapper[4819]: E0228 03:51:12.671705 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a44caf-c5d8-4b98-ad4f-16833053b82b" containerName="util"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.671718 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a44caf-c5d8-4b98-ad4f-16833053b82b" containerName="util"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.671906 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a44caf-c5d8-4b98-ad4f-16833053b82b" containerName="extract"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.673691 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.689653 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hd5pb"]
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.707466 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-catalog-content\") pod \"community-operators-hd5pb\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") " pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.707564 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-utilities\") pod \"community-operators-hd5pb\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") " pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.707604 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2xlk\" (UniqueName: \"kubernetes.io/projected/55a3cedd-dc73-43f9-b183-86d558dadc9e-kube-api-access-r2xlk\") pod \"community-operators-hd5pb\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") " pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.809355 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-catalog-content\") pod \"community-operators-hd5pb\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") " pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.810030 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-catalog-content\") pod \"community-operators-hd5pb\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") " pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.810197 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-utilities\") pod \"community-operators-hd5pb\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") " pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.810367 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2xlk\" (UniqueName: \"kubernetes.io/projected/55a3cedd-dc73-43f9-b183-86d558dadc9e-kube-api-access-r2xlk\") pod \"community-operators-hd5pb\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") " pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.810746 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-utilities\") pod \"community-operators-hd5pb\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") " pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:12 crc kubenswrapper[4819]: I0228 03:51:12.834794 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2xlk\" (UniqueName: \"kubernetes.io/projected/55a3cedd-dc73-43f9-b183-86d558dadc9e-kube-api-access-r2xlk\") pod \"community-operators-hd5pb\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") " pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:13 crc kubenswrapper[4819]: I0228 03:51:13.001062 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:13 crc kubenswrapper[4819]: I0228 03:51:13.908221 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/root-account-create-update-hnkqn"]
Feb 28 03:51:13 crc kubenswrapper[4819]: I0228 03:51:13.909118 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-hnkqn"
Feb 28 03:51:13 crc kubenswrapper[4819]: I0228 03:51:13.912596 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"openstack-mariadb-root-db-secret"
Feb 28 03:51:13 crc kubenswrapper[4819]: I0228 03:51:13.917693 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-hnkqn"]
Feb 28 03:51:14 crc kubenswrapper[4819]: I0228 03:51:14.026996 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvsgw\" (UniqueName: \"kubernetes.io/projected/5424ceda-ffd2-4956-92f4-98e527ee26d3-kube-api-access-hvsgw\") pod \"root-account-create-update-hnkqn\" (UID: \"5424ceda-ffd2-4956-92f4-98e527ee26d3\") " pod="barbican-kuttl-tests/root-account-create-update-hnkqn"
Feb 28 03:51:14 crc kubenswrapper[4819]: I0228 03:51:14.027421 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5424ceda-ffd2-4956-92f4-98e527ee26d3-operator-scripts\") pod \"root-account-create-update-hnkqn\" (UID: \"5424ceda-ffd2-4956-92f4-98e527ee26d3\") " pod="barbican-kuttl-tests/root-account-create-update-hnkqn"
Feb 28 03:51:14 crc kubenswrapper[4819]: I0228 03:51:14.129756 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5424ceda-ffd2-4956-92f4-98e527ee26d3-operator-scripts\") pod \"root-account-create-update-hnkqn\" (UID: \"5424ceda-ffd2-4956-92f4-98e527ee26d3\") " pod="barbican-kuttl-tests/root-account-create-update-hnkqn"
Feb 28 03:51:14 crc kubenswrapper[4819]: I0228 03:51:14.129962 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvsgw\" (UniqueName: \"kubernetes.io/projected/5424ceda-ffd2-4956-92f4-98e527ee26d3-kube-api-access-hvsgw\") pod \"root-account-create-update-hnkqn\" (UID: \"5424ceda-ffd2-4956-92f4-98e527ee26d3\") " pod="barbican-kuttl-tests/root-account-create-update-hnkqn"
Feb 28 03:51:14 crc kubenswrapper[4819]: I0228 03:51:14.130455 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5424ceda-ffd2-4956-92f4-98e527ee26d3-operator-scripts\") pod \"root-account-create-update-hnkqn\" (UID: \"5424ceda-ffd2-4956-92f4-98e527ee26d3\") " pod="barbican-kuttl-tests/root-account-create-update-hnkqn"
Feb 28 03:51:14 crc kubenswrapper[4819]: I0228 03:51:14.150439 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvsgw\" (UniqueName: \"kubernetes.io/projected/5424ceda-ffd2-4956-92f4-98e527ee26d3-kube-api-access-hvsgw\") pod \"root-account-create-update-hnkqn\" (UID: \"5424ceda-ffd2-4956-92f4-98e527ee26d3\") " pod="barbican-kuttl-tests/root-account-create-update-hnkqn"
Feb 28 03:51:14 crc kubenswrapper[4819]: I0228 03:51:14.223127 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-hnkqn"
Feb 28 03:51:14 crc kubenswrapper[4819]: I0228 03:51:14.490825 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hd5pb"]
Feb 28 03:51:14 crc kubenswrapper[4819]: I0228 03:51:14.532318 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-hnkqn"]
Feb 28 03:51:15 crc kubenswrapper[4819]: I0228 03:51:15.044903 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-hnkqn" event={"ID":"5424ceda-ffd2-4956-92f4-98e527ee26d3","Type":"ContainerStarted","Data":"13770e577b03cf2eec3db7527e07603547f0e88833f1a6d1b5a6ceb500310b2d"}
Feb 28 03:51:15 crc kubenswrapper[4819]: I0228 03:51:15.048860 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd5pb" event={"ID":"55a3cedd-dc73-43f9-b183-86d558dadc9e","Type":"ContainerStarted","Data":"4b5b5a8142cc485d44791f52f96ff657df3880a0282bb0700d6fb9eac973d378"}
Feb 28 03:51:15 crc kubenswrapper[4819]: E0228 03:51:15.090901 4819 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f3677a_e092_464a_925c_c0792c7590da.slice/crio-22e993bfc06cbc6fc03a047b09c3e2a10b57b7c8f841dd8ab0310015fc6e73f8\": RecentStats: unable to find data in memory cache]"
Feb 28 03:51:15 crc kubenswrapper[4819]: I0228 03:51:15.350819 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/openstack-galera-2" podUID="9b16747d-1b6b-44ee-896e-0ead9587deeb" containerName="galera" probeResult="failure" output=<
Feb 28 03:51:15 crc kubenswrapper[4819]: wsrep_local_state_comment (Donor/Desynced) differs from Synced
Feb 28 03:51:15 crc kubenswrapper[4819]: >
Feb 28 03:51:16 crc kubenswrapper[4819]: I0228 03:51:16.058423 4819 generic.go:334] "Generic (PLEG): container finished" podID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerID="3648a447641f453cf5c1c39222632374efd0b3fa62470ddf0066657a4a3bde17" exitCode=0
Feb 28 03:51:16 crc kubenswrapper[4819]: I0228 03:51:16.058556 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd5pb" event={"ID":"55a3cedd-dc73-43f9-b183-86d558dadc9e","Type":"ContainerDied","Data":"3648a447641f453cf5c1c39222632374efd0b3fa62470ddf0066657a4a3bde17"}
Feb 28 03:51:16 crc kubenswrapper[4819]: I0228 03:51:16.061379 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-hnkqn" event={"ID":"5424ceda-ffd2-4956-92f4-98e527ee26d3","Type":"ContainerStarted","Data":"c82b7d129f999d46def29cb424699c9eda5294bd13fd03ca32168648bfc810ad"}
Feb 28 03:51:16 crc kubenswrapper[4819]: I0228 03:51:16.119669 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/root-account-create-update-hnkqn" podStartSLOduration=3.119654722 podStartE2EDuration="3.119654722s" podCreationTimestamp="2026-02-28 03:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:51:16.115379356 +0000 UTC m=+1014.580948214" watchObservedRunningTime="2026-02-28 03:51:16.119654722 +0000 UTC m=+1014.585223580"
Feb 28 03:51:17 crc kubenswrapper[4819]: I0228 03:51:17.071915 4819 generic.go:334] "Generic (PLEG): container finished" podID="5424ceda-ffd2-4956-92f4-98e527ee26d3" containerID="c82b7d129f999d46def29cb424699c9eda5294bd13fd03ca32168648bfc810ad" exitCode=0
Feb 28 03:51:17 crc kubenswrapper[4819]: I0228 03:51:17.072101 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-hnkqn" event={"ID":"5424ceda-ffd2-4956-92f4-98e527ee26d3","Type":"ContainerDied","Data":"c82b7d129f999d46def29cb424699c9eda5294bd13fd03ca32168648bfc810ad"}
Feb 28 03:51:17 crc kubenswrapper[4819]: I0228 03:51:17.074285 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd5pb" event={"ID":"55a3cedd-dc73-43f9-b183-86d558dadc9e","Type":"ContainerStarted","Data":"7adb9cbc4266db23b6f3a5b4849b9fc93347246d73ca4a4a8c83038aa5240ede"}
Feb 28 03:51:18 crc kubenswrapper[4819]: I0228 03:51:18.083420 4819 generic.go:334] "Generic (PLEG): container finished" podID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerID="7adb9cbc4266db23b6f3a5b4849b9fc93347246d73ca4a4a8c83038aa5240ede" exitCode=0
Feb 28 03:51:18 crc kubenswrapper[4819]: I0228 03:51:18.083507 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd5pb" event={"ID":"55a3cedd-dc73-43f9-b183-86d558dadc9e","Type":"ContainerDied","Data":"7adb9cbc4266db23b6f3a5b4849b9fc93347246d73ca4a4a8c83038aa5240ede"}
Feb 28 03:51:18 crc kubenswrapper[4819]: I0228 03:51:18.445883 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-hnkqn"
Feb 28 03:51:18 crc kubenswrapper[4819]: I0228 03:51:18.508907 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5424ceda-ffd2-4956-92f4-98e527ee26d3-operator-scripts\") pod \"5424ceda-ffd2-4956-92f4-98e527ee26d3\" (UID: \"5424ceda-ffd2-4956-92f4-98e527ee26d3\") "
Feb 28 03:51:18 crc kubenswrapper[4819]: I0228 03:51:18.509476 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvsgw\" (UniqueName: \"kubernetes.io/projected/5424ceda-ffd2-4956-92f4-98e527ee26d3-kube-api-access-hvsgw\") pod \"5424ceda-ffd2-4956-92f4-98e527ee26d3\" (UID: \"5424ceda-ffd2-4956-92f4-98e527ee26d3\") "
Feb 28 03:51:18 crc kubenswrapper[4819]: I0228 03:51:18.509872 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5424ceda-ffd2-4956-92f4-98e527ee26d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5424ceda-ffd2-4956-92f4-98e527ee26d3" (UID: "5424ceda-ffd2-4956-92f4-98e527ee26d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 28 03:51:18 crc kubenswrapper[4819]: I0228 03:51:18.510233 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5424ceda-ffd2-4956-92f4-98e527ee26d3-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 28 03:51:18 crc kubenswrapper[4819]: I0228 03:51:18.525163 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5424ceda-ffd2-4956-92f4-98e527ee26d3-kube-api-access-hvsgw" (OuterVolumeSpecName: "kube-api-access-hvsgw") pod "5424ceda-ffd2-4956-92f4-98e527ee26d3" (UID: "5424ceda-ffd2-4956-92f4-98e527ee26d3"). InnerVolumeSpecName "kube-api-access-hvsgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:51:18 crc kubenswrapper[4819]: I0228 03:51:18.611849 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvsgw\" (UniqueName: \"kubernetes.io/projected/5424ceda-ffd2-4956-92f4-98e527ee26d3-kube-api-access-hvsgw\") on node \"crc\" DevicePath \"\""
Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.093838 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-hnkqn" event={"ID":"5424ceda-ffd2-4956-92f4-98e527ee26d3","Type":"ContainerDied","Data":"13770e577b03cf2eec3db7527e07603547f0e88833f1a6d1b5a6ceb500310b2d"}
Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.093891 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13770e577b03cf2eec3db7527e07603547f0e88833f1a6d1b5a6ceb500310b2d"
Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.093902 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-hnkqn"
Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.130056 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5"]
Feb 28 03:51:19 crc kubenswrapper[4819]: E0228 03:51:19.130543 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5424ceda-ffd2-4956-92f4-98e527ee26d3" containerName="mariadb-account-create-update"
Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.130582 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="5424ceda-ffd2-4956-92f4-98e527ee26d3" containerName="mariadb-account-create-update"
Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.130848 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="5424ceda-ffd2-4956-92f4-98e527ee26d3" containerName="mariadb-account-create-update"
Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.131708 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5"
Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.135789 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-gzp9t"
Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.136751 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5"]
Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.257625 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-69p9f"]
Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.259633 4819 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.274478 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-69p9f"] Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.320307 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9ssk\" (UniqueName: \"kubernetes.io/projected/58dd9e5c-5ce0-4cef-b287-413997f8aa49-kube-api-access-z9ssk\") pod \"rabbitmq-cluster-operator-779fc9694b-gzvt5\" (UID: \"58dd9e5c-5ce0-4cef-b287-413997f8aa49\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.421523 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxl2f\" (UniqueName: \"kubernetes.io/projected/d47914cf-d85e-46ad-a1d2-597f4e670d66-kube-api-access-xxl2f\") pod \"certified-operators-69p9f\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.422099 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9ssk\" (UniqueName: \"kubernetes.io/projected/58dd9e5c-5ce0-4cef-b287-413997f8aa49-kube-api-access-z9ssk\") pod \"rabbitmq-cluster-operator-779fc9694b-gzvt5\" (UID: \"58dd9e5c-5ce0-4cef-b287-413997f8aa49\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.422180 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-utilities\") pod \"certified-operators-69p9f\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " pod="openshift-marketplace/certified-operators-69p9f" Feb 28 
03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.422222 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-catalog-content\") pod \"certified-operators-69p9f\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.459639 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9ssk\" (UniqueName: \"kubernetes.io/projected/58dd9e5c-5ce0-4cef-b287-413997f8aa49-kube-api-access-z9ssk\") pod \"rabbitmq-cluster-operator-779fc9694b-gzvt5\" (UID: \"58dd9e5c-5ce0-4cef-b287-413997f8aa49\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.524090 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxl2f\" (UniqueName: \"kubernetes.io/projected/d47914cf-d85e-46ad-a1d2-597f4e670d66-kube-api-access-xxl2f\") pod \"certified-operators-69p9f\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.524528 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-utilities\") pod \"certified-operators-69p9f\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.524556 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-catalog-content\") pod \"certified-operators-69p9f\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " 
pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.524977 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-utilities\") pod \"certified-operators-69p9f\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.525012 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-catalog-content\") pod \"certified-operators-69p9f\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.561405 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxl2f\" (UniqueName: \"kubernetes.io/projected/d47914cf-d85e-46ad-a1d2-597f4e670d66-kube-api-access-xxl2f\") pod \"certified-operators-69p9f\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.578863 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:19 crc kubenswrapper[4819]: I0228 03:51:19.753811 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5" Feb 28 03:51:20 crc kubenswrapper[4819]: I0228 03:51:20.088893 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-69p9f"] Feb 28 03:51:20 crc kubenswrapper[4819]: I0228 03:51:20.110185 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd5pb" event={"ID":"55a3cedd-dc73-43f9-b183-86d558dadc9e","Type":"ContainerStarted","Data":"ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905"} Feb 28 03:51:20 crc kubenswrapper[4819]: I0228 03:51:20.124197 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hd5pb" podStartSLOduration=5.375059229 podStartE2EDuration="8.124182121s" podCreationTimestamp="2026-02-28 03:51:12 +0000 UTC" firstStartedPulling="2026-02-28 03:51:16.06118464 +0000 UTC m=+1014.526753498" lastFinishedPulling="2026-02-28 03:51:18.810307502 +0000 UTC m=+1017.275876390" observedRunningTime="2026-02-28 03:51:20.121961906 +0000 UTC m=+1018.587530764" watchObservedRunningTime="2026-02-28 03:51:20.124182121 +0000 UTC m=+1018.589750979" Feb 28 03:51:20 crc kubenswrapper[4819]: I0228 03:51:20.212553 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5"] Feb 28 03:51:20 crc kubenswrapper[4819]: W0228 03:51:20.220057 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod58dd9e5c_5ce0_4cef_b287_413997f8aa49.slice/crio-28b21015957886a85e3600963c2e22becd88991b7598157d9ac33064dc26598e WatchSource:0}: Error finding container 28b21015957886a85e3600963c2e22becd88991b7598157d9ac33064dc26598e: Status 404 returned error can't find the container with id 28b21015957886a85e3600963c2e22becd88991b7598157d9ac33064dc26598e Feb 28 03:51:21 crc kubenswrapper[4819]: I0228 03:51:21.118959 4819 
generic.go:334] "Generic (PLEG): container finished" podID="d47914cf-d85e-46ad-a1d2-597f4e670d66" containerID="eaf9522f1b727c86effddd0c7043d79bc186a74a9693f09fa9cede776e416a46" exitCode=0 Feb 28 03:51:21 crc kubenswrapper[4819]: I0228 03:51:21.119043 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69p9f" event={"ID":"d47914cf-d85e-46ad-a1d2-597f4e670d66","Type":"ContainerDied","Data":"eaf9522f1b727c86effddd0c7043d79bc186a74a9693f09fa9cede776e416a46"} Feb 28 03:51:21 crc kubenswrapper[4819]: I0228 03:51:21.119329 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69p9f" event={"ID":"d47914cf-d85e-46ad-a1d2-597f4e670d66","Type":"ContainerStarted","Data":"cc86fdd4242c76049768ba934c3bf1fdce7153935e6e793575ca9dfa3fe41e48"} Feb 28 03:51:21 crc kubenswrapper[4819]: I0228 03:51:21.127575 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5" event={"ID":"58dd9e5c-5ce0-4cef-b287-413997f8aa49","Type":"ContainerStarted","Data":"28b21015957886a85e3600963c2e22becd88991b7598157d9ac33064dc26598e"} Feb 28 03:51:22 crc kubenswrapper[4819]: I0228 03:51:22.134072 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-1" Feb 28 03:51:22 crc kubenswrapper[4819]: I0228 03:51:22.137947 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69p9f" event={"ID":"d47914cf-d85e-46ad-a1d2-597f4e670d66","Type":"ContainerStarted","Data":"9b85d7584ff0bf167ed78c7bc593b342818c76fc41701f3fd035a1e56989f5b1"} Feb 28 03:51:22 crc kubenswrapper[4819]: I0228 03:51:22.246939 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-1" Feb 28 03:51:23 crc kubenswrapper[4819]: I0228 03:51:23.001297 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-hd5pb" Feb 28 03:51:23 crc kubenswrapper[4819]: I0228 03:51:23.001638 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hd5pb" Feb 28 03:51:23 crc kubenswrapper[4819]: I0228 03:51:23.041889 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hd5pb" Feb 28 03:51:23 crc kubenswrapper[4819]: I0228 03:51:23.146934 4819 generic.go:334] "Generic (PLEG): container finished" podID="d47914cf-d85e-46ad-a1d2-597f4e670d66" containerID="9b85d7584ff0bf167ed78c7bc593b342818c76fc41701f3fd035a1e56989f5b1" exitCode=0 Feb 28 03:51:23 crc kubenswrapper[4819]: I0228 03:51:23.147037 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69p9f" event={"ID":"d47914cf-d85e-46ad-a1d2-597f4e670d66","Type":"ContainerDied","Data":"9b85d7584ff0bf167ed78c7bc593b342818c76fc41701f3fd035a1e56989f5b1"} Feb 28 03:51:24 crc kubenswrapper[4819]: I0228 03:51:24.716634 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-0" Feb 28 03:51:24 crc kubenswrapper[4819]: I0228 03:51:24.805361 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-0" Feb 28 03:51:25 crc kubenswrapper[4819]: I0228 03:51:25.167765 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5" event={"ID":"58dd9e5c-5ce0-4cef-b287-413997f8aa49","Type":"ContainerStarted","Data":"8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918"} Feb 28 03:51:25 crc kubenswrapper[4819]: I0228 03:51:25.171211 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69p9f" 
event={"ID":"d47914cf-d85e-46ad-a1d2-597f4e670d66","Type":"ContainerStarted","Data":"a7d31e86e2dd6ae9e7f16d88e2550b46002decc2e4cbbeea489d5137c6e95e8b"} Feb 28 03:51:25 crc kubenswrapper[4819]: I0228 03:51:25.191517 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5" podStartSLOduration=1.938798201 podStartE2EDuration="6.191497123s" podCreationTimestamp="2026-02-28 03:51:19 +0000 UTC" firstStartedPulling="2026-02-28 03:51:20.224153844 +0000 UTC m=+1018.689722692" lastFinishedPulling="2026-02-28 03:51:24.476852756 +0000 UTC m=+1022.942421614" observedRunningTime="2026-02-28 03:51:25.183646028 +0000 UTC m=+1023.649214916" watchObservedRunningTime="2026-02-28 03:51:25.191497123 +0000 UTC m=+1023.657065981" Feb 28 03:51:29 crc kubenswrapper[4819]: I0228 03:51:29.579870 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:29 crc kubenswrapper[4819]: I0228 03:51:29.580376 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:29 crc kubenswrapper[4819]: I0228 03:51:29.664332 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:29 crc kubenswrapper[4819]: I0228 03:51:29.691093 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-69p9f" podStartSLOduration=7.352686149 podStartE2EDuration="10.691067746s" podCreationTimestamp="2026-02-28 03:51:19 +0000 UTC" firstStartedPulling="2026-02-28 03:51:21.122150154 +0000 UTC m=+1019.587719022" lastFinishedPulling="2026-02-28 03:51:24.460531751 +0000 UTC m=+1022.926100619" observedRunningTime="2026-02-28 03:51:25.21272039 +0000 UTC m=+1023.678289258" watchObservedRunningTime="2026-02-28 03:51:29.691067746 +0000 UTC 
m=+1028.156636604" Feb 28 03:51:30 crc kubenswrapper[4819]: I0228 03:51:30.287937 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.269045 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k9pp9"] Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.271120 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.284701 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k9pp9"] Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.424227 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-utilities\") pod \"redhat-operators-k9pp9\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.424334 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tctdh\" (UniqueName: \"kubernetes.io/projected/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-kube-api-access-tctdh\") pod \"redhat-operators-k9pp9\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.424389 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-catalog-content\") pod \"redhat-operators-k9pp9\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:51:32 crc 
kubenswrapper[4819]: I0228 03:51:32.526291 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-utilities\") pod \"redhat-operators-k9pp9\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.526386 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tctdh\" (UniqueName: \"kubernetes.io/projected/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-kube-api-access-tctdh\") pod \"redhat-operators-k9pp9\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.526433 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-catalog-content\") pod \"redhat-operators-k9pp9\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.528966 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-catalog-content\") pod \"redhat-operators-k9pp9\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.529413 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-utilities\") pod \"redhat-operators-k9pp9\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.554394 4819 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tctdh\" (UniqueName: \"kubernetes.io/projected/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-kube-api-access-tctdh\") pod \"redhat-operators-k9pp9\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.604031 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.848779 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-69p9f"] Feb 28 03:51:32 crc kubenswrapper[4819]: I0228 03:51:32.849353 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-69p9f" podUID="d47914cf-d85e-46ad-a1d2-597f4e670d66" containerName="registry-server" containerID="cri-o://a7d31e86e2dd6ae9e7f16d88e2550b46002decc2e4cbbeea489d5137c6e95e8b" gracePeriod=2 Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.058655 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hd5pb" Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.061240 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k9pp9"] Feb 28 03:51:33 crc kubenswrapper[4819]: W0228 03:51:33.065754 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e0d1e7c_1d4d_48d1_9c22_d3fa153bb458.slice/crio-5fa3fa2baf88fa7aeb7889897f2ff5cc704d021e83f8d5d118d0acee15df894b WatchSource:0}: Error finding container 5fa3fa2baf88fa7aeb7889897f2ff5cc704d021e83f8d5d118d0acee15df894b: Status 404 returned error can't find the container with id 5fa3fa2baf88fa7aeb7889897f2ff5cc704d021e83f8d5d118d0acee15df894b Feb 28 03:51:33 crc 
kubenswrapper[4819]: I0228 03:51:33.234601 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9pp9" event={"ID":"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458","Type":"ContainerStarted","Data":"49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50"} Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.234641 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9pp9" event={"ID":"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458","Type":"ContainerStarted","Data":"5fa3fa2baf88fa7aeb7889897f2ff5cc704d021e83f8d5d118d0acee15df894b"} Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.236579 4819 generic.go:334] "Generic (PLEG): container finished" podID="d47914cf-d85e-46ad-a1d2-597f4e670d66" containerID="a7d31e86e2dd6ae9e7f16d88e2550b46002decc2e4cbbeea489d5137c6e95e8b" exitCode=0 Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.236608 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69p9f" event={"ID":"d47914cf-d85e-46ad-a1d2-597f4e670d66","Type":"ContainerDied","Data":"a7d31e86e2dd6ae9e7f16d88e2550b46002decc2e4cbbeea489d5137c6e95e8b"} Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.266263 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.439529 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxl2f\" (UniqueName: \"kubernetes.io/projected/d47914cf-d85e-46ad-a1d2-597f4e670d66-kube-api-access-xxl2f\") pod \"d47914cf-d85e-46ad-a1d2-597f4e670d66\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.439642 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-catalog-content\") pod \"d47914cf-d85e-46ad-a1d2-597f4e670d66\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.439765 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-utilities\") pod \"d47914cf-d85e-46ad-a1d2-597f4e670d66\" (UID: \"d47914cf-d85e-46ad-a1d2-597f4e670d66\") " Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.440523 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-utilities" (OuterVolumeSpecName: "utilities") pod "d47914cf-d85e-46ad-a1d2-597f4e670d66" (UID: "d47914cf-d85e-46ad-a1d2-597f4e670d66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.445457 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d47914cf-d85e-46ad-a1d2-597f4e670d66-kube-api-access-xxl2f" (OuterVolumeSpecName: "kube-api-access-xxl2f") pod "d47914cf-d85e-46ad-a1d2-597f4e670d66" (UID: "d47914cf-d85e-46ad-a1d2-597f4e670d66"). InnerVolumeSpecName "kube-api-access-xxl2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.542206 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:51:33 crc kubenswrapper[4819]: I0228 03:51:33.542511 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxl2f\" (UniqueName: \"kubernetes.io/projected/d47914cf-d85e-46ad-a1d2-597f4e670d66-kube-api-access-xxl2f\") on node \"crc\" DevicePath \"\"" Feb 28 03:51:34 crc kubenswrapper[4819]: I0228 03:51:34.249167 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69p9f" event={"ID":"d47914cf-d85e-46ad-a1d2-597f4e670d66","Type":"ContainerDied","Data":"cc86fdd4242c76049768ba934c3bf1fdce7153935e6e793575ca9dfa3fe41e48"} Feb 28 03:51:34 crc kubenswrapper[4819]: I0228 03:51:34.249440 4819 scope.go:117] "RemoveContainer" containerID="a7d31e86e2dd6ae9e7f16d88e2550b46002decc2e4cbbeea489d5137c6e95e8b" Feb 28 03:51:34 crc kubenswrapper[4819]: I0228 03:51:34.249451 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-69p9f" Feb 28 03:51:34 crc kubenswrapper[4819]: I0228 03:51:34.252757 4819 generic.go:334] "Generic (PLEG): container finished" podID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerID="49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50" exitCode=0 Feb 28 03:51:34 crc kubenswrapper[4819]: I0228 03:51:34.252781 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9pp9" event={"ID":"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458","Type":"ContainerDied","Data":"49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50"} Feb 28 03:51:34 crc kubenswrapper[4819]: I0228 03:51:34.285503 4819 scope.go:117] "RemoveContainer" containerID="9b85d7584ff0bf167ed78c7bc593b342818c76fc41701f3fd035a1e56989f5b1" Feb 28 03:51:34 crc kubenswrapper[4819]: I0228 03:51:34.311993 4819 scope.go:117] "RemoveContainer" containerID="eaf9522f1b727c86effddd0c7043d79bc186a74a9693f09fa9cede776e416a46" Feb 28 03:51:34 crc kubenswrapper[4819]: I0228 03:51:34.510009 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d47914cf-d85e-46ad-a1d2-597f4e670d66" (UID: "d47914cf-d85e-46ad-a1d2-597f4e670d66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:51:34 crc kubenswrapper[4819]: I0228 03:51:34.557687 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47914cf-d85e-46ad-a1d2-597f4e670d66-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:51:34 crc kubenswrapper[4819]: I0228 03:51:34.583137 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-69p9f"] Feb 28 03:51:34 crc kubenswrapper[4819]: I0228 03:51:34.597532 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-69p9f"] Feb 28 03:51:35 crc kubenswrapper[4819]: I0228 03:51:35.263846 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9pp9" event={"ID":"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458","Type":"ContainerStarted","Data":"5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454"} Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.259787 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-92zwp"] Feb 28 03:51:36 crc kubenswrapper[4819]: E0228 03:51:36.260582 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d47914cf-d85e-46ad-a1d2-597f4e670d66" containerName="extract-utilities" Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.260606 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47914cf-d85e-46ad-a1d2-597f4e670d66" containerName="extract-utilities" Feb 28 03:51:36 crc kubenswrapper[4819]: E0228 03:51:36.260663 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d47914cf-d85e-46ad-a1d2-597f4e670d66" containerName="registry-server" Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.260703 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47914cf-d85e-46ad-a1d2-597f4e670d66" containerName="registry-server" Feb 28 03:51:36 crc 
kubenswrapper[4819]: E0228 03:51:36.260727 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d47914cf-d85e-46ad-a1d2-597f4e670d66" containerName="extract-content"
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.260741 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47914cf-d85e-46ad-a1d2-597f4e670d66" containerName="extract-content"
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.260971 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="d47914cf-d85e-46ad-a1d2-597f4e670d66" containerName="registry-server"
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.261687 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-92zwp"
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.264981 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-4bzwb"
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.280132 4819 generic.go:334] "Generic (PLEG): container finished" podID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerID="5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454" exitCode=0
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.280207 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9pp9" event={"ID":"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458","Type":"ContainerDied","Data":"5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454"}
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.282233 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-92zwp"]
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.383790 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d47914cf-d85e-46ad-a1d2-597f4e670d66" path="/var/lib/kubelet/pods/d47914cf-d85e-46ad-a1d2-597f4e670d66/volumes"
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.391218 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgxhm\" (UniqueName: \"kubernetes.io/projected/967804ee-64cc-4594-900d-be115f006e13-kube-api-access-kgxhm\") pod \"keystone-operator-index-92zwp\" (UID: \"967804ee-64cc-4594-900d-be115f006e13\") " pod="openstack-operators/keystone-operator-index-92zwp"
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.492797 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgxhm\" (UniqueName: \"kubernetes.io/projected/967804ee-64cc-4594-900d-be115f006e13-kube-api-access-kgxhm\") pod \"keystone-operator-index-92zwp\" (UID: \"967804ee-64cc-4594-900d-be115f006e13\") " pod="openstack-operators/keystone-operator-index-92zwp"
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.519274 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgxhm\" (UniqueName: \"kubernetes.io/projected/967804ee-64cc-4594-900d-be115f006e13-kube-api-access-kgxhm\") pod \"keystone-operator-index-92zwp\" (UID: \"967804ee-64cc-4594-900d-be115f006e13\") " pod="openstack-operators/keystone-operator-index-92zwp"
Feb 28 03:51:36 crc kubenswrapper[4819]: I0228 03:51:36.606901 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-92zwp"
Feb 28 03:51:37 crc kubenswrapper[4819]: I0228 03:51:37.073126 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-92zwp"]
Feb 28 03:51:37 crc kubenswrapper[4819]: I0228 03:51:37.287467 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-92zwp" event={"ID":"967804ee-64cc-4594-900d-be115f006e13","Type":"ContainerStarted","Data":"6265ad18864e48e6b8bbad22fc11603ecd8b4c2d24ce7bd7ea8e07da1b920fb3"}
Feb 28 03:51:37 crc kubenswrapper[4819]: I0228 03:51:37.291018 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9pp9" event={"ID":"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458","Type":"ContainerStarted","Data":"91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e"}
Feb 28 03:51:37 crc kubenswrapper[4819]: I0228 03:51:37.315307 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k9pp9" podStartSLOduration=2.826575193 podStartE2EDuration="5.315277628s" podCreationTimestamp="2026-02-28 03:51:32 +0000 UTC" firstStartedPulling="2026-02-28 03:51:34.254828894 +0000 UTC m=+1032.720397792" lastFinishedPulling="2026-02-28 03:51:36.743531299 +0000 UTC m=+1035.209100227" observedRunningTime="2026-02-28 03:51:37.309470763 +0000 UTC m=+1035.775039631" watchObservedRunningTime="2026-02-28 03:51:37.315277628 +0000 UTC m=+1035.780846496"
Feb 28 03:51:41 crc kubenswrapper[4819]: I0228 03:51:41.846517 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hd5pb"]
Feb 28 03:51:41 crc kubenswrapper[4819]: I0228 03:51:41.847098 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hd5pb" podUID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerName="registry-server" containerID="cri-o://ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905" gracePeriod=2
Feb 28 03:51:42 crc kubenswrapper[4819]: I0228 03:51:42.604366 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k9pp9"
Feb 28 03:51:42 crc kubenswrapper[4819]: I0228 03:51:42.604904 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k9pp9"
Feb 28 03:51:43 crc kubenswrapper[4819]: E0228 03:51:43.002408 4819 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905 is running failed: container process not found" containerID="ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905" cmd=["grpc_health_probe","-addr=:50051"]
Feb 28 03:51:43 crc kubenswrapper[4819]: E0228 03:51:43.002953 4819 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905 is running failed: container process not found" containerID="ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905" cmd=["grpc_health_probe","-addr=:50051"]
Feb 28 03:51:43 crc kubenswrapper[4819]: E0228 03:51:43.003444 4819 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905 is running failed: container process not found" containerID="ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905" cmd=["grpc_health_probe","-addr=:50051"]
Feb 28 03:51:43 crc kubenswrapper[4819]: E0228 03:51:43.003489 4819 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-hd5pb" podUID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerName="registry-server"
Feb 28 03:51:43 crc kubenswrapper[4819]: I0228 03:51:43.669950 4819 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k9pp9" podUID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerName="registry-server" probeResult="failure" output=<
Feb 28 03:51:43 crc kubenswrapper[4819]: timeout: failed to connect service ":50051" within 1s
Feb 28 03:51:43 crc kubenswrapper[4819]: >
Feb 28 03:51:45 crc kubenswrapper[4819]: I0228 03:51:45.358736 4819 generic.go:334] "Generic (PLEG): container finished" podID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerID="ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905" exitCode=0
Feb 28 03:51:45 crc kubenswrapper[4819]: I0228 03:51:45.358775 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd5pb" event={"ID":"55a3cedd-dc73-43f9-b183-86d558dadc9e","Type":"ContainerDied","Data":"ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905"}
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.629647 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"]
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.632379 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.635744 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-server-dockercfg-268tf"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.635768 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-default-user"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.636354 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"rabbitmq-plugins-conf"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.636622 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"rabbitmq-server-conf"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.636781 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-erlang-cookie"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.639560 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"]
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.713542 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.714001 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8575a62-7205-495b-80ed-2c715e87cc72-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.714073 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.714121 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.714157 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8575a62-7205-495b-80ed-2c715e87cc72-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.715405 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n4g8\" (UniqueName: \"kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-kube-api-access-2n4g8\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.715712 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8575a62-7205-495b-80ed-2c715e87cc72-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.715806 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.817412 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8575a62-7205-495b-80ed-2c715e87cc72-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.818685 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.818813 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.818874 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8575a62-7205-495b-80ed-2c715e87cc72-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.819067 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n4g8\" (UniqueName: \"kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-kube-api-access-2n4g8\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.819147 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8575a62-7205-495b-80ed-2c715e87cc72-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.819927 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.821441 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.821599 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.826382 4819 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.826448 4819 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/34c0f23254917f7f8a476541583a7b5131be93ff444eb61db8dab22f24c929d1/globalmount\"" pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.828429 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8575a62-7205-495b-80ed-2c715e87cc72-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.828813 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.833995 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8575a62-7205-495b-80ed-2c715e87cc72-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.837689 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.846087 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8575a62-7205-495b-80ed-2c715e87cc72-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.853822 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n4g8\" (UniqueName: \"kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-kube-api-access-2n4g8\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.881047 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\") pod \"rabbitmq-server-0\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.913366 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:49 crc kubenswrapper[4819]: I0228 03:51:49.973342 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0"
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.025265 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2xlk\" (UniqueName: \"kubernetes.io/projected/55a3cedd-dc73-43f9-b183-86d558dadc9e-kube-api-access-r2xlk\") pod \"55a3cedd-dc73-43f9-b183-86d558dadc9e\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") "
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.025652 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-catalog-content\") pod \"55a3cedd-dc73-43f9-b183-86d558dadc9e\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") "
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.025866 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-utilities\") pod \"55a3cedd-dc73-43f9-b183-86d558dadc9e\" (UID: \"55a3cedd-dc73-43f9-b183-86d558dadc9e\") "
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.027451 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-utilities" (OuterVolumeSpecName: "utilities") pod "55a3cedd-dc73-43f9-b183-86d558dadc9e" (UID: "55a3cedd-dc73-43f9-b183-86d558dadc9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.031953 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a3cedd-dc73-43f9-b183-86d558dadc9e-kube-api-access-r2xlk" (OuterVolumeSpecName: "kube-api-access-r2xlk") pod "55a3cedd-dc73-43f9-b183-86d558dadc9e" (UID: "55a3cedd-dc73-43f9-b183-86d558dadc9e"). InnerVolumeSpecName "kube-api-access-r2xlk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.112913 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55a3cedd-dc73-43f9-b183-86d558dadc9e" (UID: "55a3cedd-dc73-43f9-b183-86d558dadc9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.129674 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-utilities\") on node \"crc\" DevicePath \"\""
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.129704 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2xlk\" (UniqueName: \"kubernetes.io/projected/55a3cedd-dc73-43f9-b183-86d558dadc9e-kube-api-access-r2xlk\") on node \"crc\" DevicePath \"\""
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.129715 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a3cedd-dc73-43f9-b183-86d558dadc9e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.415158 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-92zwp" event={"ID":"967804ee-64cc-4594-900d-be115f006e13","Type":"ContainerStarted","Data":"9e4ad6facbc1340ef90d01f1148163fc8e33c7f10755df06a4cad94a2218a18a"}
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.419824 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd5pb" event={"ID":"55a3cedd-dc73-43f9-b183-86d558dadc9e","Type":"ContainerDied","Data":"4b5b5a8142cc485d44791f52f96ff657df3880a0282bb0700d6fb9eac973d378"}
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.419898 4819 scope.go:117] "RemoveContainer" containerID="ce068547bd1efd4974d0537d79db91e509e7e8321e08da96d666d463dc8fc905"
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.420200 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hd5pb"
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.452035 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-92zwp" podStartSLOduration=1.3177470900000001 podStartE2EDuration="14.452005144s" podCreationTimestamp="2026-02-28 03:51:36 +0000 UTC" firstStartedPulling="2026-02-28 03:51:37.084176268 +0000 UTC m=+1035.549745136" lastFinishedPulling="2026-02-28 03:51:50.218434332 +0000 UTC m=+1048.684003190" observedRunningTime="2026-02-28 03:51:50.440019209 +0000 UTC m=+1048.905588097" watchObservedRunningTime="2026-02-28 03:51:50.452005144 +0000 UTC m=+1048.917574042"
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.455090 4819 scope.go:117] "RemoveContainer" containerID="7adb9cbc4266db23b6f3a5b4849b9fc93347246d73ca4a4a8c83038aa5240ede"
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.473365 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hd5pb"]
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.477593 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hd5pb"]
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.485055 4819 scope.go:117] "RemoveContainer" containerID="3648a447641f453cf5c1c39222632374efd0b3fa62470ddf0066657a4a3bde17"
Feb 28 03:51:50 crc kubenswrapper[4819]: I0228 03:51:50.568219 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"]
Feb 28 03:51:50 crc kubenswrapper[4819]: W0228 03:51:50.572928 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8575a62_7205_495b_80ed_2c715e87cc72.slice/crio-553758d51adb878b84a876cd4bbb2a8379b8c408688813adeb28f7c08a847dee WatchSource:0}: Error finding container 553758d51adb878b84a876cd4bbb2a8379b8c408688813adeb28f7c08a847dee: Status 404 returned error can't find the container with id 553758d51adb878b84a876cd4bbb2a8379b8c408688813adeb28f7c08a847dee
Feb 28 03:51:51 crc kubenswrapper[4819]: I0228 03:51:51.436411 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"c8575a62-7205-495b-80ed-2c715e87cc72","Type":"ContainerStarted","Data":"553758d51adb878b84a876cd4bbb2a8379b8c408688813adeb28f7c08a847dee"}
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.395701 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a3cedd-dc73-43f9-b183-86d558dadc9e" path="/var/lib/kubelet/pods/55a3cedd-dc73-43f9-b183-86d558dadc9e/volumes"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.661482 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k9pp9"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.678487 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6f2h7"]
Feb 28 03:51:52 crc kubenswrapper[4819]: E0228 03:51:52.678871 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerName="registry-server"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.678891 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerName="registry-server"
Feb 28 03:51:52 crc kubenswrapper[4819]: E0228 03:51:52.678908 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerName="extract-content"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.678916 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerName="extract-content"
Feb 28 03:51:52 crc kubenswrapper[4819]: E0228 03:51:52.678934 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerName="extract-utilities"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.678943 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerName="extract-utilities"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.679082 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a3cedd-dc73-43f9-b183-86d558dadc9e" containerName="registry-server"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.684224 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6f2h7"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.691362 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6f2h7"]
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.708232 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k9pp9"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.787316 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-utilities\") pod \"redhat-marketplace-6f2h7\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " pod="openshift-marketplace/redhat-marketplace-6f2h7"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.787366 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-catalog-content\") pod \"redhat-marketplace-6f2h7\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " pod="openshift-marketplace/redhat-marketplace-6f2h7"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.787392 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wzhq\" (UniqueName: \"kubernetes.io/projected/c991929a-48e3-4d5e-8fa7-bc049eda14cd-kube-api-access-8wzhq\") pod \"redhat-marketplace-6f2h7\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " pod="openshift-marketplace/redhat-marketplace-6f2h7"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.889431 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-catalog-content\") pod \"redhat-marketplace-6f2h7\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " pod="openshift-marketplace/redhat-marketplace-6f2h7"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.890219 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-utilities\") pod \"redhat-marketplace-6f2h7\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " pod="openshift-marketplace/redhat-marketplace-6f2h7"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.890423 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wzhq\" (UniqueName: \"kubernetes.io/projected/c991929a-48e3-4d5e-8fa7-bc049eda14cd-kube-api-access-8wzhq\") pod \"redhat-marketplace-6f2h7\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " pod="openshift-marketplace/redhat-marketplace-6f2h7"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.890758 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-catalog-content\") pod \"redhat-marketplace-6f2h7\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " pod="openshift-marketplace/redhat-marketplace-6f2h7"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.891042 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-utilities\") pod \"redhat-marketplace-6f2h7\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " pod="openshift-marketplace/redhat-marketplace-6f2h7"
Feb 28 03:51:52 crc kubenswrapper[4819]: I0228 03:51:52.925280 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wzhq\" (UniqueName: \"kubernetes.io/projected/c991929a-48e3-4d5e-8fa7-bc049eda14cd-kube-api-access-8wzhq\") pod \"redhat-marketplace-6f2h7\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " pod="openshift-marketplace/redhat-marketplace-6f2h7"
Feb 28 03:51:53 crc kubenswrapper[4819]: I0228 03:51:53.011226 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6f2h7"
Feb 28 03:51:56 crc kubenswrapper[4819]: I0228 03:51:56.536378 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6f2h7"]
Feb 28 03:51:56 crc kubenswrapper[4819]: I0228 03:51:56.607498 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-92zwp"
Feb 28 03:51:56 crc kubenswrapper[4819]: I0228 03:51:56.607640 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-92zwp"
Feb 28 03:51:56 crc kubenswrapper[4819]: I0228 03:51:56.644980 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-92zwp"
Feb 28 03:51:57 crc kubenswrapper[4819]: I0228 03:51:57.485951 4819 generic.go:334] "Generic (PLEG): container finished" podID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" containerID="143fbe11360d1bb036cf3e907e18e152907d980fd9299741787f164da7b27cf1" exitCode=0
Feb 28 03:51:57 crc kubenswrapper[4819]: I0228 03:51:57.486063 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f2h7" event={"ID":"c991929a-48e3-4d5e-8fa7-bc049eda14cd","Type":"ContainerDied","Data":"143fbe11360d1bb036cf3e907e18e152907d980fd9299741787f164da7b27cf1"}
Feb 28 03:51:57 crc kubenswrapper[4819]: I0228 03:51:57.486401 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f2h7" event={"ID":"c991929a-48e3-4d5e-8fa7-bc049eda14cd","Type":"ContainerStarted","Data":"fff2d79b5235fc63a3b1b3a11ff5a96240cc0ff2f784b99828c0bb1543588744"}
Feb 28 03:51:57 crc kubenswrapper[4819]: I0228 03:51:57.530021 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-92zwp"
Feb 28 03:51:58 crc kubenswrapper[4819]: I0228 03:51:58.499152 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"c8575a62-7205-495b-80ed-2c715e87cc72","Type":"ContainerStarted","Data":"550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa"}
Feb 28 03:51:58 crc kubenswrapper[4819]: I0228 03:51:58.504454 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f2h7" event={"ID":"c991929a-48e3-4d5e-8fa7-bc049eda14cd","Type":"ContainerStarted","Data":"ad1837dc5c632e925c7ed6d85c54178c1006bf96d89249ed134ad8544bf9b6f9"}
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.503263 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"]
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.506987 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.518978 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gzck7"
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.537012 4819 generic.go:334] "Generic (PLEG): container finished" podID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" containerID="ad1837dc5c632e925c7ed6d85c54178c1006bf96d89249ed134ad8544bf9b6f9" exitCode=0
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.537202 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f2h7" event={"ID":"c991929a-48e3-4d5e-8fa7-bc049eda14cd","Type":"ContainerDied","Data":"ad1837dc5c632e925c7ed6d85c54178c1006bf96d89249ed134ad8544bf9b6f9"}
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.552564 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"]
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.601074 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99mdk\" (UniqueName: \"kubernetes.io/projected/778a63de-9182-440d-9a6c-b9526ccb40fe-kube-api-access-99mdk\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.601169 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-util\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.601228 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-bundle\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.702852 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-util\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.702948 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-bundle\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.703014 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99mdk\" (UniqueName: \"kubernetes.io/projected/778a63de-9182-440d-9a6c-b9526ccb40fe-kube-api-access-99mdk\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"
Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 
03:51:59.703967 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-bundle\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.704217 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-util\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.736302 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99mdk\" (UniqueName: \"kubernetes.io/projected/778a63de-9182-440d-9a6c-b9526ccb40fe-kube-api-access-99mdk\") pod \"8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" Feb 28 03:51:59 crc kubenswrapper[4819]: I0228 03:51:59.843015 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.141557 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537512-hwm9s"] Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.142865 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537512-hwm9s" Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.144899 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.146186 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw" Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.146459 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.162272 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537512-hwm9s"] Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.213060 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxvx2\" (UniqueName: \"kubernetes.io/projected/288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7-kube-api-access-xxvx2\") pod \"auto-csr-approver-29537512-hwm9s\" (UID: \"288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7\") " pod="openshift-infra/auto-csr-approver-29537512-hwm9s" Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.314034 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxvx2\" (UniqueName: \"kubernetes.io/projected/288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7-kube-api-access-xxvx2\") pod \"auto-csr-approver-29537512-hwm9s\" (UID: \"288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7\") " pod="openshift-infra/auto-csr-approver-29537512-hwm9s" Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.337827 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxvx2\" (UniqueName: \"kubernetes.io/projected/288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7-kube-api-access-xxvx2\") pod \"auto-csr-approver-29537512-hwm9s\" (UID: \"288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7\") " 
pod="openshift-infra/auto-csr-approver-29537512-hwm9s" Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.394310 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"] Feb 28 03:52:00 crc kubenswrapper[4819]: W0228 03:52:00.397751 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod778a63de_9182_440d_9a6c_b9526ccb40fe.slice/crio-b5bfa4ff1c369c4a66cbc342189b4c4118a0f67195b770dee19c23073f4ccddc WatchSource:0}: Error finding container b5bfa4ff1c369c4a66cbc342189b4c4118a0f67195b770dee19c23073f4ccddc: Status 404 returned error can't find the container with id b5bfa4ff1c369c4a66cbc342189b4c4118a0f67195b770dee19c23073f4ccddc Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.468344 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537512-hwm9s" Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.555992 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" event={"ID":"778a63de-9182-440d-9a6c-b9526ccb40fe","Type":"ContainerStarted","Data":"4ab273dfbc4bb8a4746e8c840d380434faf5007db662fe2d274f05bf7c85cd42"} Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.556291 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" event={"ID":"778a63de-9182-440d-9a6c-b9526ccb40fe","Type":"ContainerStarted","Data":"b5bfa4ff1c369c4a66cbc342189b4c4118a0f67195b770dee19c23073f4ccddc"} Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.578948 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f2h7" 
event={"ID":"c991929a-48e3-4d5e-8fa7-bc049eda14cd","Type":"ContainerStarted","Data":"0a96101acb88bec56806425c1a0f677e15f8fe7c4a24b321262fa126414ec2ac"} Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.604140 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6f2h7" podStartSLOduration=6.129672003 podStartE2EDuration="8.604124177s" podCreationTimestamp="2026-02-28 03:51:52 +0000 UTC" firstStartedPulling="2026-02-28 03:51:57.487838462 +0000 UTC m=+1055.953407330" lastFinishedPulling="2026-02-28 03:51:59.962290606 +0000 UTC m=+1058.427859504" observedRunningTime="2026-02-28 03:52:00.603958934 +0000 UTC m=+1059.069527792" watchObservedRunningTime="2026-02-28 03:52:00.604124177 +0000 UTC m=+1059.069693025" Feb 28 03:52:00 crc kubenswrapper[4819]: I0228 03:52:00.877046 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537512-hwm9s"] Feb 28 03:52:00 crc kubenswrapper[4819]: W0228 03:52:00.878555 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod288a4e97_a8b4_4ba9_b55a_e69f14c6a7c7.slice/crio-97cbc9f6f824c418145f434c080da6ecc629c268a84450271e80614313aadff1 WatchSource:0}: Error finding container 97cbc9f6f824c418145f434c080da6ecc629c268a84450271e80614313aadff1: Status 404 returned error can't find the container with id 97cbc9f6f824c418145f434c080da6ecc629c268a84450271e80614313aadff1 Feb 28 03:52:01 crc kubenswrapper[4819]: I0228 03:52:01.586509 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537512-hwm9s" event={"ID":"288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7","Type":"ContainerStarted","Data":"97cbc9f6f824c418145f434c080da6ecc629c268a84450271e80614313aadff1"} Feb 28 03:52:01 crc kubenswrapper[4819]: I0228 03:52:01.590201 4819 generic.go:334] "Generic (PLEG): container finished" podID="778a63de-9182-440d-9a6c-b9526ccb40fe" 
containerID="4ab273dfbc4bb8a4746e8c840d380434faf5007db662fe2d274f05bf7c85cd42" exitCode=0 Feb 28 03:52:01 crc kubenswrapper[4819]: I0228 03:52:01.590314 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" event={"ID":"778a63de-9182-440d-9a6c-b9526ccb40fe","Type":"ContainerDied","Data":"4ab273dfbc4bb8a4746e8c840d380434faf5007db662fe2d274f05bf7c85cd42"} Feb 28 03:52:01 crc kubenswrapper[4819]: I0228 03:52:01.847890 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k9pp9"] Feb 28 03:52:01 crc kubenswrapper[4819]: I0228 03:52:01.848577 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k9pp9" podUID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerName="registry-server" containerID="cri-o://91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e" gracePeriod=2 Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.273008 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.348861 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-utilities\") pod \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.349170 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tctdh\" (UniqueName: \"kubernetes.io/projected/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-kube-api-access-tctdh\") pod \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.349291 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-catalog-content\") pod \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\" (UID: \"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458\") " Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.355041 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-utilities" (OuterVolumeSpecName: "utilities") pod "2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" (UID: "2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.357424 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-kube-api-access-tctdh" (OuterVolumeSpecName: "kube-api-access-tctdh") pod "2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" (UID: "2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458"). InnerVolumeSpecName "kube-api-access-tctdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.452187 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tctdh\" (UniqueName: \"kubernetes.io/projected/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-kube-api-access-tctdh\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.452637 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.512800 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" (UID: "2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.554012 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.601910 4819 generic.go:334] "Generic (PLEG): container finished" podID="288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7" containerID="0f483024b42efa5b30db139b11ffd97e6fe5923f275055315398edfe1490b914" exitCode=0 Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.601998 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537512-hwm9s" event={"ID":"288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7","Type":"ContainerDied","Data":"0f483024b42efa5b30db139b11ffd97e6fe5923f275055315398edfe1490b914"} Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.604041 4819 generic.go:334] "Generic (PLEG): container 
finished" podID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerID="91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e" exitCode=0 Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.604137 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9pp9" event={"ID":"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458","Type":"ContainerDied","Data":"91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e"} Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.604174 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9pp9" event={"ID":"2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458","Type":"ContainerDied","Data":"5fa3fa2baf88fa7aeb7889897f2ff5cc704d021e83f8d5d118d0acee15df894b"} Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.604197 4819 scope.go:117] "RemoveContainer" containerID="91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.605092 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k9pp9" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.618035 4819 generic.go:334] "Generic (PLEG): container finished" podID="778a63de-9182-440d-9a6c-b9526ccb40fe" containerID="495028479d72ccade35bdc2249ad245f58ef913aeae108a2ae175ab0f2e65e03" exitCode=0 Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.618086 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" event={"ID":"778a63de-9182-440d-9a6c-b9526ccb40fe","Type":"ContainerDied","Data":"495028479d72ccade35bdc2249ad245f58ef913aeae108a2ae175ab0f2e65e03"} Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.640487 4819 scope.go:117] "RemoveContainer" containerID="5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.672953 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k9pp9"] Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.677357 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k9pp9"] Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.688216 4819 scope.go:117] "RemoveContainer" containerID="49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.711462 4819 scope.go:117] "RemoveContainer" containerID="91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e" Feb 28 03:52:02 crc kubenswrapper[4819]: E0228 03:52:02.711924 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e\": container with ID starting with 91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e not found: ID does not exist" 
containerID="91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.711959 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e"} err="failed to get container status \"91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e\": rpc error: code = NotFound desc = could not find container \"91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e\": container with ID starting with 91e9592775e92aac52ef2a74bd45c39cdcb3e37828de91a3f20f0b1dd5413e1e not found: ID does not exist" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.711983 4819 scope.go:117] "RemoveContainer" containerID="5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454" Feb 28 03:52:02 crc kubenswrapper[4819]: E0228 03:52:02.712399 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454\": container with ID starting with 5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454 not found: ID does not exist" containerID="5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.712430 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454"} err="failed to get container status \"5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454\": rpc error: code = NotFound desc = could not find container \"5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454\": container with ID starting with 5d55636503614e7699a2a70e01b0a4ee13abe31cc877b6a07206b92c09c2c454 not found: ID does not exist" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.712449 4819 scope.go:117] 
"RemoveContainer" containerID="49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50" Feb 28 03:52:02 crc kubenswrapper[4819]: E0228 03:52:02.712702 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50\": container with ID starting with 49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50 not found: ID does not exist" containerID="49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50" Feb 28 03:52:02 crc kubenswrapper[4819]: I0228 03:52:02.712728 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50"} err="failed to get container status \"49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50\": rpc error: code = NotFound desc = could not find container \"49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50\": container with ID starting with 49dc19e6225dc13889d1745cceef04921d0dafbaa7859569b73b79cf149dac50 not found: ID does not exist" Feb 28 03:52:03 crc kubenswrapper[4819]: I0228 03:52:03.012705 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6f2h7" Feb 28 03:52:03 crc kubenswrapper[4819]: I0228 03:52:03.012946 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6f2h7" Feb 28 03:52:03 crc kubenswrapper[4819]: I0228 03:52:03.059497 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6f2h7" Feb 28 03:52:03 crc kubenswrapper[4819]: I0228 03:52:03.633621 4819 generic.go:334] "Generic (PLEG): container finished" podID="778a63de-9182-440d-9a6c-b9526ccb40fe" containerID="0ab3c436d4fb3a495be539e08073c5ba842461b6c84aedb5c2e399b55f58d227" exitCode=0 Feb 28 03:52:03 crc 
kubenswrapper[4819]: I0228 03:52:03.633753 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" event={"ID":"778a63de-9182-440d-9a6c-b9526ccb40fe","Type":"ContainerDied","Data":"0ab3c436d4fb3a495be539e08073c5ba842461b6c84aedb5c2e399b55f58d227"} Feb 28 03:52:04 crc kubenswrapper[4819]: I0228 03:52:04.024692 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537512-hwm9s" Feb 28 03:52:04 crc kubenswrapper[4819]: I0228 03:52:04.079074 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxvx2\" (UniqueName: \"kubernetes.io/projected/288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7-kube-api-access-xxvx2\") pod \"288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7\" (UID: \"288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7\") " Feb 28 03:52:04 crc kubenswrapper[4819]: I0228 03:52:04.088490 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7-kube-api-access-xxvx2" (OuterVolumeSpecName: "kube-api-access-xxvx2") pod "288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7" (UID: "288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7"). InnerVolumeSpecName "kube-api-access-xxvx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:52:04 crc kubenswrapper[4819]: I0228 03:52:04.181578 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxvx2\" (UniqueName: \"kubernetes.io/projected/288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7-kube-api-access-xxvx2\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:04 crc kubenswrapper[4819]: I0228 03:52:04.385508 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" path="/var/lib/kubelet/pods/2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458/volumes" Feb 28 03:52:04 crc kubenswrapper[4819]: I0228 03:52:04.645473 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537512-hwm9s" event={"ID":"288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7","Type":"ContainerDied","Data":"97cbc9f6f824c418145f434c080da6ecc629c268a84450271e80614313aadff1"} Feb 28 03:52:04 crc kubenswrapper[4819]: I0228 03:52:04.645535 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97cbc9f6f824c418145f434c080da6ecc629c268a84450271e80614313aadff1" Feb 28 03:52:04 crc kubenswrapper[4819]: I0228 03:52:04.645533 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537512-hwm9s" Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.018210 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.095699 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-util\") pod \"778a63de-9182-440d-9a6c-b9526ccb40fe\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.095971 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99mdk\" (UniqueName: \"kubernetes.io/projected/778a63de-9182-440d-9a6c-b9526ccb40fe-kube-api-access-99mdk\") pod \"778a63de-9182-440d-9a6c-b9526ccb40fe\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.096082 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-bundle\") pod \"778a63de-9182-440d-9a6c-b9526ccb40fe\" (UID: \"778a63de-9182-440d-9a6c-b9526ccb40fe\") " Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.096946 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-bundle" (OuterVolumeSpecName: "bundle") pod "778a63de-9182-440d-9a6c-b9526ccb40fe" (UID: "778a63de-9182-440d-9a6c-b9526ccb40fe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.100111 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/778a63de-9182-440d-9a6c-b9526ccb40fe-kube-api-access-99mdk" (OuterVolumeSpecName: "kube-api-access-99mdk") pod "778a63de-9182-440d-9a6c-b9526ccb40fe" (UID: "778a63de-9182-440d-9a6c-b9526ccb40fe"). InnerVolumeSpecName "kube-api-access-99mdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.120146 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-util" (OuterVolumeSpecName: "util") pod "778a63de-9182-440d-9a6c-b9526ccb40fe" (UID: "778a63de-9182-440d-9a6c-b9526ccb40fe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.155987 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537506-m59k7"] Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.162030 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537506-m59k7"] Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.198100 4819 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-util\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.198129 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99mdk\" (UniqueName: \"kubernetes.io/projected/778a63de-9182-440d-9a6c-b9526ccb40fe-kube-api-access-99mdk\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.198141 4819 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/778a63de-9182-440d-9a6c-b9526ccb40fe-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.657171 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" event={"ID":"778a63de-9182-440d-9a6c-b9526ccb40fe","Type":"ContainerDied","Data":"b5bfa4ff1c369c4a66cbc342189b4c4118a0f67195b770dee19c23073f4ccddc"} Feb 28 03:52:05 crc kubenswrapper[4819]: 
I0228 03:52:05.657585 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5bfa4ff1c369c4a66cbc342189b4c4118a0f67195b770dee19c23073f4ccddc" Feb 28 03:52:05 crc kubenswrapper[4819]: I0228 03:52:05.657320 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn" Feb 28 03:52:06 crc kubenswrapper[4819]: I0228 03:52:06.377501 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c27a9b1-4c0e-4860-9ce5-565c3f610d1f" path="/var/lib/kubelet/pods/3c27a9b1-4c0e-4860-9ce5-565c3f610d1f/volumes" Feb 28 03:52:13 crc kubenswrapper[4819]: I0228 03:52:13.053733 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6f2h7" Feb 28 03:52:13 crc kubenswrapper[4819]: I0228 03:52:13.847737 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6f2h7"] Feb 28 03:52:13 crc kubenswrapper[4819]: I0228 03:52:13.848011 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6f2h7" podUID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" containerName="registry-server" containerID="cri-o://0a96101acb88bec56806425c1a0f677e15f8fe7c4a24b321262fa126414ec2ac" gracePeriod=2 Feb 28 03:52:14 crc kubenswrapper[4819]: I0228 03:52:14.747037 4819 generic.go:334] "Generic (PLEG): container finished" podID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" containerID="0a96101acb88bec56806425c1a0f677e15f8fe7c4a24b321262fa126414ec2ac" exitCode=0 Feb 28 03:52:14 crc kubenswrapper[4819]: I0228 03:52:14.747151 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f2h7" event={"ID":"c991929a-48e3-4d5e-8fa7-bc049eda14cd","Type":"ContainerDied","Data":"0a96101acb88bec56806425c1a0f677e15f8fe7c4a24b321262fa126414ec2ac"} Feb 28 03:52:16 crc 
kubenswrapper[4819]: I0228 03:52:16.101045 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m"] Feb 28 03:52:16 crc kubenswrapper[4819]: E0228 03:52:16.101539 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerName="registry-server" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.101550 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerName="registry-server" Feb 28 03:52:16 crc kubenswrapper[4819]: E0228 03:52:16.101567 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="778a63de-9182-440d-9a6c-b9526ccb40fe" containerName="util" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.101575 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="778a63de-9182-440d-9a6c-b9526ccb40fe" containerName="util" Feb 28 03:52:16 crc kubenswrapper[4819]: E0228 03:52:16.101588 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerName="extract-utilities" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.101594 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerName="extract-utilities" Feb 28 03:52:16 crc kubenswrapper[4819]: E0228 03:52:16.101610 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7" containerName="oc" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.101616 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7" containerName="oc" Feb 28 03:52:16 crc kubenswrapper[4819]: E0228 03:52:16.101625 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="778a63de-9182-440d-9a6c-b9526ccb40fe" containerName="extract" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.101631 4819 
state_mem.go:107] "Deleted CPUSet assignment" podUID="778a63de-9182-440d-9a6c-b9526ccb40fe" containerName="extract" Feb 28 03:52:16 crc kubenswrapper[4819]: E0228 03:52:16.101640 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerName="extract-content" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.101646 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerName="extract-content" Feb 28 03:52:16 crc kubenswrapper[4819]: E0228 03:52:16.101656 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="778a63de-9182-440d-9a6c-b9526ccb40fe" containerName="pull" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.101662 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="778a63de-9182-440d-9a6c-b9526ccb40fe" containerName="pull" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.101763 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0d1e7c-1d4d-48d1-9c22-d3fa153bb458" containerName="registry-server" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.101775 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="778a63de-9182-440d-9a6c-b9526ccb40fe" containerName="extract" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.101784 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7" containerName="oc" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.102180 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.105521 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-r6xkf" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.109158 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.134352 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m"] Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.181644 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-apiservice-cert\") pod \"keystone-operator-controller-manager-7959cbcbf4-vs45m\" (UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.181703 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-webhook-cert\") pod \"keystone-operator-controller-manager-7959cbcbf4-vs45m\" (UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.181939 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jhxv\" (UniqueName: \"kubernetes.io/projected/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-kube-api-access-7jhxv\") pod \"keystone-operator-controller-manager-7959cbcbf4-vs45m\" 
(UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.283771 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jhxv\" (UniqueName: \"kubernetes.io/projected/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-kube-api-access-7jhxv\") pod \"keystone-operator-controller-manager-7959cbcbf4-vs45m\" (UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.283843 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-apiservice-cert\") pod \"keystone-operator-controller-manager-7959cbcbf4-vs45m\" (UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.283868 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-webhook-cert\") pod \"keystone-operator-controller-manager-7959cbcbf4-vs45m\" (UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.289051 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-webhook-cert\") pod \"keystone-operator-controller-manager-7959cbcbf4-vs45m\" (UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.293707 4819 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-apiservice-cert\") pod \"keystone-operator-controller-manager-7959cbcbf4-vs45m\" (UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.297633 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jhxv\" (UniqueName: \"kubernetes.io/projected/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-kube-api-access-7jhxv\") pod \"keystone-operator-controller-manager-7959cbcbf4-vs45m\" (UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.421607 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:16 crc kubenswrapper[4819]: I0228 03:52:16.866135 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m"] Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.209180 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6f2h7" Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.298578 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wzhq\" (UniqueName: \"kubernetes.io/projected/c991929a-48e3-4d5e-8fa7-bc049eda14cd-kube-api-access-8wzhq\") pod \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.298712 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-utilities\") pod \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.298823 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-catalog-content\") pod \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\" (UID: \"c991929a-48e3-4d5e-8fa7-bc049eda14cd\") " Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.300174 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-utilities" (OuterVolumeSpecName: "utilities") pod "c991929a-48e3-4d5e-8fa7-bc049eda14cd" (UID: "c991929a-48e3-4d5e-8fa7-bc049eda14cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.305543 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c991929a-48e3-4d5e-8fa7-bc049eda14cd-kube-api-access-8wzhq" (OuterVolumeSpecName: "kube-api-access-8wzhq") pod "c991929a-48e3-4d5e-8fa7-bc049eda14cd" (UID: "c991929a-48e3-4d5e-8fa7-bc049eda14cd"). InnerVolumeSpecName "kube-api-access-8wzhq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.330116 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c991929a-48e3-4d5e-8fa7-bc049eda14cd" (UID: "c991929a-48e3-4d5e-8fa7-bc049eda14cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.400937 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.400990 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c991929a-48e3-4d5e-8fa7-bc049eda14cd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.401011 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wzhq\" (UniqueName: \"kubernetes.io/projected/c991929a-48e3-4d5e-8fa7-bc049eda14cd-kube-api-access-8wzhq\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.772289 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6f2h7" event={"ID":"c991929a-48e3-4d5e-8fa7-bc049eda14cd","Type":"ContainerDied","Data":"fff2d79b5235fc63a3b1b3a11ff5a96240cc0ff2f784b99828c0bb1543588744"} Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.772352 4819 scope.go:117] "RemoveContainer" containerID="0a96101acb88bec56806425c1a0f677e15f8fe7c4a24b321262fa126414ec2ac" Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.772492 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6f2h7" Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.777824 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" event={"ID":"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c","Type":"ContainerStarted","Data":"fd177a251b1db2bf83d95162348b32906e52321b80ffea711d261942676bad28"} Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.812905 4819 scope.go:117] "RemoveContainer" containerID="ad1837dc5c632e925c7ed6d85c54178c1006bf96d89249ed134ad8544bf9b6f9" Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.820945 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6f2h7"] Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.831301 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6f2h7"] Feb 28 03:52:17 crc kubenswrapper[4819]: I0228 03:52:17.836783 4819 scope.go:117] "RemoveContainer" containerID="143fbe11360d1bb036cf3e907e18e152907d980fd9299741787f164da7b27cf1" Feb 28 03:52:18 crc kubenswrapper[4819]: I0228 03:52:18.385893 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" path="/var/lib/kubelet/pods/c991929a-48e3-4d5e-8fa7-bc049eda14cd/volumes" Feb 28 03:52:20 crc kubenswrapper[4819]: I0228 03:52:20.803179 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" event={"ID":"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c","Type":"ContainerStarted","Data":"080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40"} Feb 28 03:52:20 crc kubenswrapper[4819]: I0228 03:52:20.803870 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:20 crc kubenswrapper[4819]: I0228 
03:52:20.835473 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" podStartSLOduration=1.275523449 podStartE2EDuration="4.835448866s" podCreationTimestamp="2026-02-28 03:52:16 +0000 UTC" firstStartedPulling="2026-02-28 03:52:16.875497394 +0000 UTC m=+1075.341066262" lastFinishedPulling="2026-02-28 03:52:20.435422821 +0000 UTC m=+1078.900991679" observedRunningTime="2026-02-28 03:52:20.829339941 +0000 UTC m=+1079.294908809" watchObservedRunningTime="2026-02-28 03:52:20.835448866 +0000 UTC m=+1079.301017754" Feb 28 03:52:26 crc kubenswrapper[4819]: I0228 03:52:26.428069 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:52:30 crc kubenswrapper[4819]: I0228 03:52:30.834553 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 03:52:30 crc kubenswrapper[4819]: I0228 03:52:30.835196 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 03:52:30 crc kubenswrapper[4819]: I0228 03:52:30.877446 4819 generic.go:334] "Generic (PLEG): container finished" podID="c8575a62-7205-495b-80ed-2c715e87cc72" containerID="550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa" exitCode=0 Feb 28 03:52:30 crc kubenswrapper[4819]: I0228 03:52:30.877528 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" 
event={"ID":"c8575a62-7205-495b-80ed-2c715e87cc72","Type":"ContainerDied","Data":"550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa"} Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.852683 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-jb4rj"] Feb 28 03:52:31 crc kubenswrapper[4819]: E0228 03:52:31.853500 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" containerName="extract-utilities" Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.853521 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" containerName="extract-utilities" Feb 28 03:52:31 crc kubenswrapper[4819]: E0228 03:52:31.853549 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" containerName="extract-content" Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.853562 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" containerName="extract-content" Feb 28 03:52:31 crc kubenswrapper[4819]: E0228 03:52:31.853589 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" containerName="registry-server" Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.853601 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" containerName="registry-server" Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.853810 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="c991929a-48e3-4d5e-8fa7-bc049eda14cd" containerName="registry-server" Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.854391 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-jb4rj" Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.857891 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-index-dockercfg-mwgbx" Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.860101 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-jb4rj"] Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.886917 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"c8575a62-7205-495b-80ed-2c715e87cc72","Type":"ContainerStarted","Data":"d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3"} Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.887160 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/rabbitmq-server-0" Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.913006 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.472665544 podStartE2EDuration="43.91298728s" podCreationTimestamp="2026-02-28 03:51:48 +0000 UTC" firstStartedPulling="2026-02-28 03:51:50.577020109 +0000 UTC m=+1049.042588977" lastFinishedPulling="2026-02-28 03:51:57.017341845 +0000 UTC m=+1055.482910713" observedRunningTime="2026-02-28 03:52:31.906859195 +0000 UTC m=+1090.372428063" watchObservedRunningTime="2026-02-28 03:52:31.91298728 +0000 UTC m=+1090.378556158" Feb 28 03:52:31 crc kubenswrapper[4819]: I0228 03:52:31.951670 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmqb4\" (UniqueName: \"kubernetes.io/projected/82ee77d5-3f8c-42b9-8025-6c6c73fa17fc-kube-api-access-gmqb4\") pod \"barbican-operator-index-jb4rj\" (UID: \"82ee77d5-3f8c-42b9-8025-6c6c73fa17fc\") " 
pod="openstack-operators/barbican-operator-index-jb4rj" Feb 28 03:52:32 crc kubenswrapper[4819]: I0228 03:52:32.052960 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmqb4\" (UniqueName: \"kubernetes.io/projected/82ee77d5-3f8c-42b9-8025-6c6c73fa17fc-kube-api-access-gmqb4\") pod \"barbican-operator-index-jb4rj\" (UID: \"82ee77d5-3f8c-42b9-8025-6c6c73fa17fc\") " pod="openstack-operators/barbican-operator-index-jb4rj" Feb 28 03:52:32 crc kubenswrapper[4819]: I0228 03:52:32.073299 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmqb4\" (UniqueName: \"kubernetes.io/projected/82ee77d5-3f8c-42b9-8025-6c6c73fa17fc-kube-api-access-gmqb4\") pod \"barbican-operator-index-jb4rj\" (UID: \"82ee77d5-3f8c-42b9-8025-6c6c73fa17fc\") " pod="openstack-operators/barbican-operator-index-jb4rj" Feb 28 03:52:32 crc kubenswrapper[4819]: I0228 03:52:32.177951 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-jb4rj" Feb 28 03:52:32 crc kubenswrapper[4819]: I0228 03:52:32.621922 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-jb4rj"] Feb 28 03:52:32 crc kubenswrapper[4819]: W0228 03:52:32.627301 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ee77d5_3f8c_42b9_8025_6c6c73fa17fc.slice/crio-539d9bb2be4f1108e7298473daad3a65c67c95507973367f84762c8efc1092d3 WatchSource:0}: Error finding container 539d9bb2be4f1108e7298473daad3a65c67c95507973367f84762c8efc1092d3: Status 404 returned error can't find the container with id 539d9bb2be4f1108e7298473daad3a65c67c95507973367f84762c8efc1092d3 Feb 28 03:52:32 crc kubenswrapper[4819]: I0228 03:52:32.896423 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-jb4rj" 
event={"ID":"82ee77d5-3f8c-42b9-8025-6c6c73fa17fc","Type":"ContainerStarted","Data":"539d9bb2be4f1108e7298473daad3a65c67c95507973367f84762c8efc1092d3"} Feb 28 03:52:35 crc kubenswrapper[4819]: I0228 03:52:35.922895 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-jb4rj" event={"ID":"82ee77d5-3f8c-42b9-8025-6c6c73fa17fc","Type":"ContainerStarted","Data":"d1eb67778e8cb89f7837f042bd88806819c80e72bee56a9ac1e30c8f8c7b08cc"} Feb 28 03:52:35 crc kubenswrapper[4819]: I0228 03:52:35.943843 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-jb4rj" podStartSLOduration=2.7194396210000003 podStartE2EDuration="4.943819092s" podCreationTimestamp="2026-02-28 03:52:31 +0000 UTC" firstStartedPulling="2026-02-28 03:52:32.634496516 +0000 UTC m=+1091.100065374" lastFinishedPulling="2026-02-28 03:52:34.858875977 +0000 UTC m=+1093.324444845" observedRunningTime="2026-02-28 03:52:35.937716347 +0000 UTC m=+1094.403285215" watchObservedRunningTime="2026-02-28 03:52:35.943819092 +0000 UTC m=+1094.409387950" Feb 28 03:52:42 crc kubenswrapper[4819]: I0228 03:52:42.181976 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/barbican-operator-index-jb4rj" Feb 28 03:52:42 crc kubenswrapper[4819]: I0228 03:52:42.182771 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-index-jb4rj" Feb 28 03:52:42 crc kubenswrapper[4819]: I0228 03:52:42.237630 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/barbican-operator-index-jb4rj" Feb 28 03:52:43 crc kubenswrapper[4819]: I0228 03:52:43.007356 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-index-jb4rj" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.579386 4819 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5"] Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.580696 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.582528 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-db-secret" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.589704 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-db-create-lgl97"] Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.590971 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-lgl97" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.600111 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5"] Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.609054 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-lgl97"] Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.634798 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24f9c\" (UniqueName: \"kubernetes.io/projected/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-kube-api-access-24f9c\") pod \"keystone-ee83-account-create-update-5qtv5\" (UID: \"7c428fea-2d2c-4e5c-9244-8eedf6cae97f\") " pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.634980 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-operator-scripts\") pod \"keystone-ee83-account-create-update-5qtv5\" (UID: 
\"7c428fea-2d2c-4e5c-9244-8eedf6cae97f\") " pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.737321 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjqws\" (UniqueName: \"kubernetes.io/projected/c6f29299-f69a-4e2c-938a-59404c33d64c-kube-api-access-xjqws\") pod \"keystone-db-create-lgl97\" (UID: \"c6f29299-f69a-4e2c-938a-59404c33d64c\") " pod="barbican-kuttl-tests/keystone-db-create-lgl97" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.737418 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-operator-scripts\") pod \"keystone-ee83-account-create-update-5qtv5\" (UID: \"7c428fea-2d2c-4e5c-9244-8eedf6cae97f\") " pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.737460 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6f29299-f69a-4e2c-938a-59404c33d64c-operator-scripts\") pod \"keystone-db-create-lgl97\" (UID: \"c6f29299-f69a-4e2c-938a-59404c33d64c\") " pod="barbican-kuttl-tests/keystone-db-create-lgl97" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.737632 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24f9c\" (UniqueName: \"kubernetes.io/projected/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-kube-api-access-24f9c\") pod \"keystone-ee83-account-create-update-5qtv5\" (UID: \"7c428fea-2d2c-4e5c-9244-8eedf6cae97f\") " pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.739032 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-operator-scripts\") pod \"keystone-ee83-account-create-update-5qtv5\" (UID: \"7c428fea-2d2c-4e5c-9244-8eedf6cae97f\") " pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.763198 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24f9c\" (UniqueName: \"kubernetes.io/projected/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-kube-api-access-24f9c\") pod \"keystone-ee83-account-create-update-5qtv5\" (UID: \"7c428fea-2d2c-4e5c-9244-8eedf6cae97f\") " pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.839393 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjqws\" (UniqueName: \"kubernetes.io/projected/c6f29299-f69a-4e2c-938a-59404c33d64c-kube-api-access-xjqws\") pod \"keystone-db-create-lgl97\" (UID: \"c6f29299-f69a-4e2c-938a-59404c33d64c\") " pod="barbican-kuttl-tests/keystone-db-create-lgl97" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.839506 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6f29299-f69a-4e2c-938a-59404c33d64c-operator-scripts\") pod \"keystone-db-create-lgl97\" (UID: \"c6f29299-f69a-4e2c-938a-59404c33d64c\") " pod="barbican-kuttl-tests/keystone-db-create-lgl97" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.840386 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6f29299-f69a-4e2c-938a-59404c33d64c-operator-scripts\") pod \"keystone-db-create-lgl97\" (UID: \"c6f29299-f69a-4e2c-938a-59404c33d64c\") " pod="barbican-kuttl-tests/keystone-db-create-lgl97" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.868121 4819 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-xjqws\" (UniqueName: \"kubernetes.io/projected/c6f29299-f69a-4e2c-938a-59404c33d64c-kube-api-access-xjqws\") pod \"keystone-db-create-lgl97\" (UID: \"c6f29299-f69a-4e2c-938a-59404c33d64c\") " pod="barbican-kuttl-tests/keystone-db-create-lgl97" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.910309 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.923174 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-lgl97" Feb 28 03:52:49 crc kubenswrapper[4819]: I0228 03:52:49.991108 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/rabbitmq-server-0" Feb 28 03:52:50 crc kubenswrapper[4819]: I0228 03:52:50.102849 4819 scope.go:117] "RemoveContainer" containerID="5abee5d70577849f85b29cca4af7de719ed06cdde61734d4d8f214e9eb80f00d" Feb 28 03:52:50 crc kubenswrapper[4819]: I0228 03:52:50.476732 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5"] Feb 28 03:52:50 crc kubenswrapper[4819]: W0228 03:52:50.479707 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c428fea_2d2c_4e5c_9244_8eedf6cae97f.slice/crio-d844c0bd32b0a3315ae9dce1ed3c2b4c6a303e713fdba7c8a8ec2a08cb16f1d4 WatchSource:0}: Error finding container d844c0bd32b0a3315ae9dce1ed3c2b4c6a303e713fdba7c8a8ec2a08cb16f1d4: Status 404 returned error can't find the container with id d844c0bd32b0a3315ae9dce1ed3c2b4c6a303e713fdba7c8a8ec2a08cb16f1d4 Feb 28 03:52:50 crc kubenswrapper[4819]: I0228 03:52:50.492491 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-lgl97"] Feb 28 03:52:50 crc kubenswrapper[4819]: W0228 03:52:50.505538 
4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6f29299_f69a_4e2c_938a_59404c33d64c.slice/crio-a5b1a6f266b58bd931749701bc3f9fc60ad762c2b7f061f1dd7a984e8e1a7502 WatchSource:0}: Error finding container a5b1a6f266b58bd931749701bc3f9fc60ad762c2b7f061f1dd7a984e8e1a7502: Status 404 returned error can't find the container with id a5b1a6f266b58bd931749701bc3f9fc60ad762c2b7f061f1dd7a984e8e1a7502 Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.054831 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" event={"ID":"7c428fea-2d2c-4e5c-9244-8eedf6cae97f","Type":"ContainerStarted","Data":"d844c0bd32b0a3315ae9dce1ed3c2b4c6a303e713fdba7c8a8ec2a08cb16f1d4"} Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.056547 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-lgl97" event={"ID":"c6f29299-f69a-4e2c-938a-59404c33d64c","Type":"ContainerStarted","Data":"a5b1a6f266b58bd931749701bc3f9fc60ad762c2b7f061f1dd7a984e8e1a7502"} Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.299398 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr"] Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.300528 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.302769 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gzck7" Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.313760 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr"] Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.362554 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-bundle\") pod \"2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.362620 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-util\") pod \"2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.362688 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq6tx\" (UniqueName: \"kubernetes.io/projected/284bc0b4-fcb8-4b80-94f1-0b232de6684f-kube-api-access-kq6tx\") pod \"2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 
03:52:51.463731 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq6tx\" (UniqueName: \"kubernetes.io/projected/284bc0b4-fcb8-4b80-94f1-0b232de6684f-kube-api-access-kq6tx\") pod \"2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.463843 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-bundle\") pod \"2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.463897 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-util\") pod \"2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.464468 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-bundle\") pod \"2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.464493 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-util\") pod \"2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.496973 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq6tx\" (UniqueName: \"kubernetes.io/projected/284bc0b4-fcb8-4b80-94f1-0b232de6684f-kube-api-access-kq6tx\") pod \"2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:51 crc kubenswrapper[4819]: I0228 03:52:51.617502 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:52 crc kubenswrapper[4819]: I0228 03:52:52.084703 4819 generic.go:334] "Generic (PLEG): container finished" podID="c6f29299-f69a-4e2c-938a-59404c33d64c" containerID="0c40aeb40f2ea24011ca9b9c8b48dab39bd8f56463c0d5dd1ec35a265f08cf2b" exitCode=0 Feb 28 03:52:52 crc kubenswrapper[4819]: I0228 03:52:52.085135 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-lgl97" event={"ID":"c6f29299-f69a-4e2c-938a-59404c33d64c","Type":"ContainerDied","Data":"0c40aeb40f2ea24011ca9b9c8b48dab39bd8f56463c0d5dd1ec35a265f08cf2b"} Feb 28 03:52:52 crc kubenswrapper[4819]: I0228 03:52:52.090287 4819 generic.go:334] "Generic (PLEG): container finished" podID="7c428fea-2d2c-4e5c-9244-8eedf6cae97f" containerID="771928f415d984c36b7e15323907df7846b3812dfbcfb52e97202445b058d1b9" exitCode=0 Feb 28 03:52:52 crc kubenswrapper[4819]: I0228 03:52:52.090513 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" event={"ID":"7c428fea-2d2c-4e5c-9244-8eedf6cae97f","Type":"ContainerDied","Data":"771928f415d984c36b7e15323907df7846b3812dfbcfb52e97202445b058d1b9"} Feb 28 03:52:52 crc kubenswrapper[4819]: I0228 03:52:52.098593 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr"] Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.102390 4819 generic.go:334] "Generic (PLEG): container finished" podID="284bc0b4-fcb8-4b80-94f1-0b232de6684f" containerID="f71cb2629aa563d677a945ce7654c5438c5fa8a6e777d0f318dfda79eaef8269" exitCode=0 Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.102523 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" event={"ID":"284bc0b4-fcb8-4b80-94f1-0b232de6684f","Type":"ContainerDied","Data":"f71cb2629aa563d677a945ce7654c5438c5fa8a6e777d0f318dfda79eaef8269"} Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.102641 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" event={"ID":"284bc0b4-fcb8-4b80-94f1-0b232de6684f","Type":"ContainerStarted","Data":"f6fa41501b4f7b97d5f797257c635d0b263dc239472bb2e2ef9255948f1e02cd"} Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.486839 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.576128 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-lgl97" Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.596387 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-operator-scripts\") pod \"7c428fea-2d2c-4e5c-9244-8eedf6cae97f\" (UID: \"7c428fea-2d2c-4e5c-9244-8eedf6cae97f\") " Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.596793 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24f9c\" (UniqueName: \"kubernetes.io/projected/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-kube-api-access-24f9c\") pod \"7c428fea-2d2c-4e5c-9244-8eedf6cae97f\" (UID: \"7c428fea-2d2c-4e5c-9244-8eedf6cae97f\") " Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.597312 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c428fea-2d2c-4e5c-9244-8eedf6cae97f" (UID: "7c428fea-2d2c-4e5c-9244-8eedf6cae97f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.604242 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-kube-api-access-24f9c" (OuterVolumeSpecName: "kube-api-access-24f9c") pod "7c428fea-2d2c-4e5c-9244-8eedf6cae97f" (UID: "7c428fea-2d2c-4e5c-9244-8eedf6cae97f"). InnerVolumeSpecName "kube-api-access-24f9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.698312 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjqws\" (UniqueName: \"kubernetes.io/projected/c6f29299-f69a-4e2c-938a-59404c33d64c-kube-api-access-xjqws\") pod \"c6f29299-f69a-4e2c-938a-59404c33d64c\" (UID: \"c6f29299-f69a-4e2c-938a-59404c33d64c\") " Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.698413 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6f29299-f69a-4e2c-938a-59404c33d64c-operator-scripts\") pod \"c6f29299-f69a-4e2c-938a-59404c33d64c\" (UID: \"c6f29299-f69a-4e2c-938a-59404c33d64c\") " Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.698769 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24f9c\" (UniqueName: \"kubernetes.io/projected/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-kube-api-access-24f9c\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.698796 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c428fea-2d2c-4e5c-9244-8eedf6cae97f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.699311 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6f29299-f69a-4e2c-938a-59404c33d64c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6f29299-f69a-4e2c-938a-59404c33d64c" (UID: "c6f29299-f69a-4e2c-938a-59404c33d64c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.702326 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f29299-f69a-4e2c-938a-59404c33d64c-kube-api-access-xjqws" (OuterVolumeSpecName: "kube-api-access-xjqws") pod "c6f29299-f69a-4e2c-938a-59404c33d64c" (UID: "c6f29299-f69a-4e2c-938a-59404c33d64c"). InnerVolumeSpecName "kube-api-access-xjqws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.799797 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjqws\" (UniqueName: \"kubernetes.io/projected/c6f29299-f69a-4e2c-938a-59404c33d64c-kube-api-access-xjqws\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:53 crc kubenswrapper[4819]: I0228 03:52:53.799835 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6f29299-f69a-4e2c-938a-59404c33d64c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:54 crc kubenswrapper[4819]: I0228 03:52:54.109722 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-lgl97" event={"ID":"c6f29299-f69a-4e2c-938a-59404c33d64c","Type":"ContainerDied","Data":"a5b1a6f266b58bd931749701bc3f9fc60ad762c2b7f061f1dd7a984e8e1a7502"} Feb 28 03:52:54 crc kubenswrapper[4819]: I0228 03:52:54.109766 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5b1a6f266b58bd931749701bc3f9fc60ad762c2b7f061f1dd7a984e8e1a7502" Feb 28 03:52:54 crc kubenswrapper[4819]: I0228 03:52:54.109789 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-lgl97" Feb 28 03:52:54 crc kubenswrapper[4819]: I0228 03:52:54.111519 4819 generic.go:334] "Generic (PLEG): container finished" podID="284bc0b4-fcb8-4b80-94f1-0b232de6684f" containerID="951482b6a05666a3b0d20aaba57776494dceb97af16580d34ac6a390ef29b346" exitCode=0 Feb 28 03:52:54 crc kubenswrapper[4819]: I0228 03:52:54.111611 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" event={"ID":"284bc0b4-fcb8-4b80-94f1-0b232de6684f","Type":"ContainerDied","Data":"951482b6a05666a3b0d20aaba57776494dceb97af16580d34ac6a390ef29b346"} Feb 28 03:52:54 crc kubenswrapper[4819]: I0228 03:52:54.113805 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" event={"ID":"7c428fea-2d2c-4e5c-9244-8eedf6cae97f","Type":"ContainerDied","Data":"d844c0bd32b0a3315ae9dce1ed3c2b4c6a303e713fdba7c8a8ec2a08cb16f1d4"} Feb 28 03:52:54 crc kubenswrapper[4819]: I0228 03:52:54.113837 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d844c0bd32b0a3315ae9dce1ed3c2b4c6a303e713fdba7c8a8ec2a08cb16f1d4" Feb 28 03:52:54 crc kubenswrapper[4819]: I0228 03:52:54.113889 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.123031 4819 generic.go:334] "Generic (PLEG): container finished" podID="284bc0b4-fcb8-4b80-94f1-0b232de6684f" containerID="c7a251fbf38750e5bf0f47cd64dc4b60946989fe9dd894b679233003028060c7" exitCode=0 Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.123089 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" event={"ID":"284bc0b4-fcb8-4b80-94f1-0b232de6684f","Type":"ContainerDied","Data":"c7a251fbf38750e5bf0f47cd64dc4b60946989fe9dd894b679233003028060c7"} Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.240907 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-lm96w"] Feb 28 03:52:55 crc kubenswrapper[4819]: E0228 03:52:55.241216 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c428fea-2d2c-4e5c-9244-8eedf6cae97f" containerName="mariadb-account-create-update" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.241296 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c428fea-2d2c-4e5c-9244-8eedf6cae97f" containerName="mariadb-account-create-update" Feb 28 03:52:55 crc kubenswrapper[4819]: E0228 03:52:55.241335 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f29299-f69a-4e2c-938a-59404c33d64c" containerName="mariadb-database-create" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.241344 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f29299-f69a-4e2c-938a-59404c33d64c" containerName="mariadb-database-create" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.241479 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c428fea-2d2c-4e5c-9244-8eedf6cae97f" containerName="mariadb-account-create-update" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.241497 4819 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f29299-f69a-4e2c-938a-59404c33d64c" containerName="mariadb-database-create" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.242018 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-lm96w" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.246292 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-ff8xl" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.246356 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.246474 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.248053 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.259979 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-lm96w"] Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.319895 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jsgc\" (UniqueName: \"kubernetes.io/projected/a6feb748-e921-4c51-992d-0a07a5afd987-kube-api-access-4jsgc\") pod \"keystone-db-sync-lm96w\" (UID: \"a6feb748-e921-4c51-992d-0a07a5afd987\") " pod="barbican-kuttl-tests/keystone-db-sync-lm96w" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.319953 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6feb748-e921-4c51-992d-0a07a5afd987-config-data\") pod \"keystone-db-sync-lm96w\" (UID: \"a6feb748-e921-4c51-992d-0a07a5afd987\") " 
pod="barbican-kuttl-tests/keystone-db-sync-lm96w" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.421221 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jsgc\" (UniqueName: \"kubernetes.io/projected/a6feb748-e921-4c51-992d-0a07a5afd987-kube-api-access-4jsgc\") pod \"keystone-db-sync-lm96w\" (UID: \"a6feb748-e921-4c51-992d-0a07a5afd987\") " pod="barbican-kuttl-tests/keystone-db-sync-lm96w" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.421396 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6feb748-e921-4c51-992d-0a07a5afd987-config-data\") pod \"keystone-db-sync-lm96w\" (UID: \"a6feb748-e921-4c51-992d-0a07a5afd987\") " pod="barbican-kuttl-tests/keystone-db-sync-lm96w" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.434016 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6feb748-e921-4c51-992d-0a07a5afd987-config-data\") pod \"keystone-db-sync-lm96w\" (UID: \"a6feb748-e921-4c51-992d-0a07a5afd987\") " pod="barbican-kuttl-tests/keystone-db-sync-lm96w" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.450104 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jsgc\" (UniqueName: \"kubernetes.io/projected/a6feb748-e921-4c51-992d-0a07a5afd987-kube-api-access-4jsgc\") pod \"keystone-db-sync-lm96w\" (UID: \"a6feb748-e921-4c51-992d-0a07a5afd987\") " pod="barbican-kuttl-tests/keystone-db-sync-lm96w" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.566800 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-lm96w" Feb 28 03:52:55 crc kubenswrapper[4819]: I0228 03:52:55.787535 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-lm96w"] Feb 28 03:52:56 crc kubenswrapper[4819]: I0228 03:52:56.134227 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-lm96w" event={"ID":"a6feb748-e921-4c51-992d-0a07a5afd987","Type":"ContainerStarted","Data":"ccf3368bc5cd6e1a8712e6854a072c6689bb202b74eadddb1662de7dfd650179"} Feb 28 03:52:56 crc kubenswrapper[4819]: I0228 03:52:56.543515 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:56 crc kubenswrapper[4819]: I0228 03:52:56.641080 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-util\") pod \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " Feb 28 03:52:56 crc kubenswrapper[4819]: I0228 03:52:56.641177 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-bundle\") pod \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " Feb 28 03:52:56 crc kubenswrapper[4819]: I0228 03:52:56.641208 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq6tx\" (UniqueName: \"kubernetes.io/projected/284bc0b4-fcb8-4b80-94f1-0b232de6684f-kube-api-access-kq6tx\") pod \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\" (UID: \"284bc0b4-fcb8-4b80-94f1-0b232de6684f\") " Feb 28 03:52:56 crc kubenswrapper[4819]: I0228 03:52:56.642859 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-bundle" (OuterVolumeSpecName: "bundle") pod "284bc0b4-fcb8-4b80-94f1-0b232de6684f" (UID: "284bc0b4-fcb8-4b80-94f1-0b232de6684f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:52:56 crc kubenswrapper[4819]: I0228 03:52:56.648510 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/284bc0b4-fcb8-4b80-94f1-0b232de6684f-kube-api-access-kq6tx" (OuterVolumeSpecName: "kube-api-access-kq6tx") pod "284bc0b4-fcb8-4b80-94f1-0b232de6684f" (UID: "284bc0b4-fcb8-4b80-94f1-0b232de6684f"). InnerVolumeSpecName "kube-api-access-kq6tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:52:56 crc kubenswrapper[4819]: I0228 03:52:56.671193 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-util" (OuterVolumeSpecName: "util") pod "284bc0b4-fcb8-4b80-94f1-0b232de6684f" (UID: "284bc0b4-fcb8-4b80-94f1-0b232de6684f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:52:56 crc kubenswrapper[4819]: I0228 03:52:56.742967 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq6tx\" (UniqueName: \"kubernetes.io/projected/284bc0b4-fcb8-4b80-94f1-0b232de6684f-kube-api-access-kq6tx\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:56 crc kubenswrapper[4819]: I0228 03:52:56.743042 4819 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-util\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:56 crc kubenswrapper[4819]: I0228 03:52:56.743071 4819 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/284bc0b4-fcb8-4b80-94f1-0b232de6684f-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 03:52:57 crc kubenswrapper[4819]: I0228 03:52:57.148723 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" event={"ID":"284bc0b4-fcb8-4b80-94f1-0b232de6684f","Type":"ContainerDied","Data":"f6fa41501b4f7b97d5f797257c635d0b263dc239472bb2e2ef9255948f1e02cd"} Feb 28 03:52:57 crc kubenswrapper[4819]: I0228 03:52:57.148808 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr" Feb 28 03:52:57 crc kubenswrapper[4819]: I0228 03:52:57.148841 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6fa41501b4f7b97d5f797257c635d0b263dc239472bb2e2ef9255948f1e02cd" Feb 28 03:53:00 crc kubenswrapper[4819]: I0228 03:53:00.833681 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 03:53:00 crc kubenswrapper[4819]: I0228 03:53:00.834225 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 03:53:05 crc kubenswrapper[4819]: I0228 03:53:05.246608 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-lm96w" event={"ID":"a6feb748-e921-4c51-992d-0a07a5afd987","Type":"ContainerStarted","Data":"24a30d737b8b0ab6e5d840797bc62d550feab4bc0c06489fd4f821811a692eae"} Feb 28 03:53:05 crc kubenswrapper[4819]: I0228 03:53:05.264538 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-db-sync-lm96w" podStartSLOduration=1.8070116569999999 podStartE2EDuration="10.26451637s" podCreationTimestamp="2026-02-28 03:52:55 +0000 UTC" firstStartedPulling="2026-02-28 03:52:55.796766607 +0000 UTC m=+1114.262335455" lastFinishedPulling="2026-02-28 03:53:04.25427131 +0000 UTC m=+1122.719840168" observedRunningTime="2026-02-28 03:53:05.260587883 +0000 UTC m=+1123.726156771" watchObservedRunningTime="2026-02-28 
03:53:05.26451637 +0000 UTC m=+1123.730085228" Feb 28 03:53:09 crc kubenswrapper[4819]: I0228 03:53:09.277480 4819 generic.go:334] "Generic (PLEG): container finished" podID="a6feb748-e921-4c51-992d-0a07a5afd987" containerID="24a30d737b8b0ab6e5d840797bc62d550feab4bc0c06489fd4f821811a692eae" exitCode=0 Feb 28 03:53:09 crc kubenswrapper[4819]: I0228 03:53:09.277596 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-lm96w" event={"ID":"a6feb748-e921-4c51-992d-0a07a5afd987","Type":"ContainerDied","Data":"24a30d737b8b0ab6e5d840797bc62d550feab4bc0c06489fd4f821811a692eae"} Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.011116 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd"] Feb 28 03:53:10 crc kubenswrapper[4819]: E0228 03:53:10.011459 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284bc0b4-fcb8-4b80-94f1-0b232de6684f" containerName="util" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.011480 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="284bc0b4-fcb8-4b80-94f1-0b232de6684f" containerName="util" Feb 28 03:53:10 crc kubenswrapper[4819]: E0228 03:53:10.011501 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284bc0b4-fcb8-4b80-94f1-0b232de6684f" containerName="extract" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.011509 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="284bc0b4-fcb8-4b80-94f1-0b232de6684f" containerName="extract" Feb 28 03:53:10 crc kubenswrapper[4819]: E0228 03:53:10.011523 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284bc0b4-fcb8-4b80-94f1-0b232de6684f" containerName="pull" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.011530 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="284bc0b4-fcb8-4b80-94f1-0b232de6684f" containerName="pull" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 
03:53:10.011675 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="284bc0b4-fcb8-4b80-94f1-0b232de6684f" containerName="extract" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.012158 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.015317 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-service-cert" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.016396 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5swsb" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.029192 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd"] Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.152856 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-apiservice-cert\") pod \"barbican-operator-controller-manager-55cbccd744-wl4fd\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.153208 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ghfn\" (UniqueName: \"kubernetes.io/projected/b49fd157-5376-4da3-8d0d-11a9218ce42b-kube-api-access-4ghfn\") pod \"barbican-operator-controller-manager-55cbccd744-wl4fd\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.153261 4819 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-webhook-cert\") pod \"barbican-operator-controller-manager-55cbccd744-wl4fd\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.254435 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-apiservice-cert\") pod \"barbican-operator-controller-manager-55cbccd744-wl4fd\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.254546 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ghfn\" (UniqueName: \"kubernetes.io/projected/b49fd157-5376-4da3-8d0d-11a9218ce42b-kube-api-access-4ghfn\") pod \"barbican-operator-controller-manager-55cbccd744-wl4fd\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.254620 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-webhook-cert\") pod \"barbican-operator-controller-manager-55cbccd744-wl4fd\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.263728 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-apiservice-cert\") pod 
\"barbican-operator-controller-manager-55cbccd744-wl4fd\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.264339 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-webhook-cert\") pod \"barbican-operator-controller-manager-55cbccd744-wl4fd\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.272071 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ghfn\" (UniqueName: \"kubernetes.io/projected/b49fd157-5376-4da3-8d0d-11a9218ce42b-kube-api-access-4ghfn\") pod \"barbican-operator-controller-manager-55cbccd744-wl4fd\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.330394 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.567913 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-lm96w" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.761571 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jsgc\" (UniqueName: \"kubernetes.io/projected/a6feb748-e921-4c51-992d-0a07a5afd987-kube-api-access-4jsgc\") pod \"a6feb748-e921-4c51-992d-0a07a5afd987\" (UID: \"a6feb748-e921-4c51-992d-0a07a5afd987\") " Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.761722 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6feb748-e921-4c51-992d-0a07a5afd987-config-data\") pod \"a6feb748-e921-4c51-992d-0a07a5afd987\" (UID: \"a6feb748-e921-4c51-992d-0a07a5afd987\") " Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.765595 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6feb748-e921-4c51-992d-0a07a5afd987-kube-api-access-4jsgc" (OuterVolumeSpecName: "kube-api-access-4jsgc") pod "a6feb748-e921-4c51-992d-0a07a5afd987" (UID: "a6feb748-e921-4c51-992d-0a07a5afd987"). InnerVolumeSpecName "kube-api-access-4jsgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.794560 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6feb748-e921-4c51-992d-0a07a5afd987-config-data" (OuterVolumeSpecName: "config-data") pod "a6feb748-e921-4c51-992d-0a07a5afd987" (UID: "a6feb748-e921-4c51-992d-0a07a5afd987"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.815979 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd"] Feb 28 03:53:10 crc kubenswrapper[4819]: W0228 03:53:10.822255 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb49fd157_5376_4da3_8d0d_11a9218ce42b.slice/crio-39946b790522f225db17ce25624cd5d153ba5290337061da44f8e2412253de77 WatchSource:0}: Error finding container 39946b790522f225db17ce25624cd5d153ba5290337061da44f8e2412253de77: Status 404 returned error can't find the container with id 39946b790522f225db17ce25624cd5d153ba5290337061da44f8e2412253de77 Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.863363 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6feb748-e921-4c51-992d-0a07a5afd987-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:10 crc kubenswrapper[4819]: I0228 03:53:10.863395 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jsgc\" (UniqueName: \"kubernetes.io/projected/a6feb748-e921-4c51-992d-0a07a5afd987-kube-api-access-4jsgc\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.295444 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-lm96w" event={"ID":"a6feb748-e921-4c51-992d-0a07a5afd987","Type":"ContainerDied","Data":"ccf3368bc5cd6e1a8712e6854a072c6689bb202b74eadddb1662de7dfd650179"} Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.295490 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccf3368bc5cd6e1a8712e6854a072c6689bb202b74eadddb1662de7dfd650179" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.295496 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-lm96w" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.297882 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" event={"ID":"b49fd157-5376-4da3-8d0d-11a9218ce42b","Type":"ContainerStarted","Data":"39946b790522f225db17ce25624cd5d153ba5290337061da44f8e2412253de77"} Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.508645 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-p4869"] Feb 28 03:53:11 crc kubenswrapper[4819]: E0228 03:53:11.509095 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6feb748-e921-4c51-992d-0a07a5afd987" containerName="keystone-db-sync" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.509107 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6feb748-e921-4c51-992d-0a07a5afd987" containerName="keystone-db-sync" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.509232 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6feb748-e921-4c51-992d-0a07a5afd987" containerName="keystone-db-sync" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.509678 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.511945 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.512275 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.512803 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-ff8xl" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.513094 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.513269 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"osp-secret" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.529115 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-p4869"] Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.577485 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smd5l\" (UniqueName: \"kubernetes.io/projected/e2700e78-452b-498f-8620-b94999aac328-kube-api-access-smd5l\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.577559 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-fernet-keys\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 
03:53:11.577667 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-config-data\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.577699 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-credential-keys\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.577721 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-scripts\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.679174 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smd5l\" (UniqueName: \"kubernetes.io/projected/e2700e78-452b-498f-8620-b94999aac328-kube-api-access-smd5l\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.679279 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-fernet-keys\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 
03:53:11.679430 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-config-data\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.679485 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-credential-keys\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.679528 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-scripts\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.684476 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-credential-keys\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.684737 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-config-data\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.686110 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-fernet-keys\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.688403 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-scripts\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.694681 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smd5l\" (UniqueName: \"kubernetes.io/projected/e2700e78-452b-498f-8620-b94999aac328-kube-api-access-smd5l\") pod \"keystone-bootstrap-p4869\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:11 crc kubenswrapper[4819]: I0228 03:53:11.829368 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:12 crc kubenswrapper[4819]: I0228 03:53:12.307164 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" event={"ID":"b49fd157-5376-4da3-8d0d-11a9218ce42b","Type":"ContainerStarted","Data":"8d0499475c8b14ac0ab72990fa6f2c6c91c665fa22b5935f98f254f819afc866"} Feb 28 03:53:12 crc kubenswrapper[4819]: I0228 03:53:12.307531 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:12 crc kubenswrapper[4819]: I0228 03:53:12.324028 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" podStartSLOduration=2.026779695 podStartE2EDuration="3.324012657s" podCreationTimestamp="2026-02-28 03:53:09 +0000 UTC" firstStartedPulling="2026-02-28 03:53:10.824651545 +0000 UTC m=+1129.290220423" lastFinishedPulling="2026-02-28 03:53:12.121884517 +0000 UTC m=+1130.587453385" observedRunningTime="2026-02-28 03:53:12.323415022 +0000 UTC m=+1130.788983880" watchObservedRunningTime="2026-02-28 03:53:12.324012657 +0000 UTC m=+1130.789581515" Feb 28 03:53:12 crc kubenswrapper[4819]: I0228 03:53:12.592847 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-p4869"] Feb 28 03:53:12 crc kubenswrapper[4819]: W0228 03:53:12.599674 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2700e78_452b_498f_8620_b94999aac328.slice/crio-cce877475ec173a41e480cbc396e47c34e2cd3b8dc910b2123f03b82f867f67e WatchSource:0}: Error finding container cce877475ec173a41e480cbc396e47c34e2cd3b8dc910b2123f03b82f867f67e: Status 404 returned error can't find the container with id 
cce877475ec173a41e480cbc396e47c34e2cd3b8dc910b2123f03b82f867f67e Feb 28 03:53:13 crc kubenswrapper[4819]: I0228 03:53:13.321630 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-p4869" event={"ID":"e2700e78-452b-498f-8620-b94999aac328","Type":"ContainerStarted","Data":"455c397d636246c1d36a80721a8b9986b3fec5c926f9cb5a5a8b6b20270cd096"} Feb 28 03:53:13 crc kubenswrapper[4819]: I0228 03:53:13.322023 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-p4869" event={"ID":"e2700e78-452b-498f-8620-b94999aac328","Type":"ContainerStarted","Data":"cce877475ec173a41e480cbc396e47c34e2cd3b8dc910b2123f03b82f867f67e"} Feb 28 03:53:13 crc kubenswrapper[4819]: I0228 03:53:13.345534 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-bootstrap-p4869" podStartSLOduration=2.345512736 podStartE2EDuration="2.345512736s" podCreationTimestamp="2026-02-28 03:53:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:53:13.341127058 +0000 UTC m=+1131.806695956" watchObservedRunningTime="2026-02-28 03:53:13.345512736 +0000 UTC m=+1131.811081624" Feb 28 03:53:16 crc kubenswrapper[4819]: I0228 03:53:16.349718 4819 generic.go:334] "Generic (PLEG): container finished" podID="e2700e78-452b-498f-8620-b94999aac328" containerID="455c397d636246c1d36a80721a8b9986b3fec5c926f9cb5a5a8b6b20270cd096" exitCode=0 Feb 28 03:53:16 crc kubenswrapper[4819]: I0228 03:53:16.349767 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-p4869" event={"ID":"e2700e78-452b-498f-8620-b94999aac328","Type":"ContainerDied","Data":"455c397d636246c1d36a80721a8b9986b3fec5c926f9cb5a5a8b6b20270cd096"} Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.650719 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.692481 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-config-data\") pod \"e2700e78-452b-498f-8620-b94999aac328\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.692557 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smd5l\" (UniqueName: \"kubernetes.io/projected/e2700e78-452b-498f-8620-b94999aac328-kube-api-access-smd5l\") pod \"e2700e78-452b-498f-8620-b94999aac328\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.692609 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-scripts\") pod \"e2700e78-452b-498f-8620-b94999aac328\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.692638 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-fernet-keys\") pod \"e2700e78-452b-498f-8620-b94999aac328\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.692662 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-credential-keys\") pod \"e2700e78-452b-498f-8620-b94999aac328\" (UID: \"e2700e78-452b-498f-8620-b94999aac328\") " Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.701897 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e2700e78-452b-498f-8620-b94999aac328" (UID: "e2700e78-452b-498f-8620-b94999aac328"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.702231 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e2700e78-452b-498f-8620-b94999aac328" (UID: "e2700e78-452b-498f-8620-b94999aac328"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.709048 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-scripts" (OuterVolumeSpecName: "scripts") pod "e2700e78-452b-498f-8620-b94999aac328" (UID: "e2700e78-452b-498f-8620-b94999aac328"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.710033 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2700e78-452b-498f-8620-b94999aac328-kube-api-access-smd5l" (OuterVolumeSpecName: "kube-api-access-smd5l") pod "e2700e78-452b-498f-8620-b94999aac328" (UID: "e2700e78-452b-498f-8620-b94999aac328"). InnerVolumeSpecName "kube-api-access-smd5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.732046 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-config-data" (OuterVolumeSpecName: "config-data") pod "e2700e78-452b-498f-8620-b94999aac328" (UID: "e2700e78-452b-498f-8620-b94999aac328"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.794532 4819 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.794568 4819 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.794580 4819 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.794591 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2700e78-452b-498f-8620-b94999aac328-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:17 crc kubenswrapper[4819]: I0228 03:53:17.794600 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smd5l\" (UniqueName: \"kubernetes.io/projected/e2700e78-452b-498f-8620-b94999aac328-kube-api-access-smd5l\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.367330 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-p4869" event={"ID":"e2700e78-452b-498f-8620-b94999aac328","Type":"ContainerDied","Data":"cce877475ec173a41e480cbc396e47c34e2cd3b8dc910b2123f03b82f867f67e"} Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.367379 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cce877475ec173a41e480cbc396e47c34e2cd3b8dc910b2123f03b82f867f67e" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.367436 4819 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-p4869" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.445306 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v"] Feb 28 03:53:18 crc kubenswrapper[4819]: E0228 03:53:18.445582 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2700e78-452b-498f-8620-b94999aac328" containerName="keystone-bootstrap" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.445602 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2700e78-452b-498f-8620-b94999aac328" containerName="keystone-bootstrap" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.445768 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2700e78-452b-498f-8620-b94999aac328" containerName="keystone-bootstrap" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.446318 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.451792 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.451982 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.451809 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.453956 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-ff8xl" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.457952 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v"] Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 
03:53:18.604765 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr465\" (UniqueName: \"kubernetes.io/projected/c66de22e-9900-45fe-b074-455829b4084a-kube-api-access-pr465\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.604877 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-credential-keys\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.604918 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-scripts\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.604946 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-fernet-keys\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.604977 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-config-data\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 
03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.707755 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-credential-keys\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.707853 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-scripts\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.707920 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-fernet-keys\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.707992 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-config-data\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.708089 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr465\" (UniqueName: \"kubernetes.io/projected/c66de22e-9900-45fe-b074-455829b4084a-kube-api-access-pr465\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.712497 4819 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-scripts\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.712771 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-credential-keys\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.713729 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-fernet-keys\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.716656 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-config-data\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.735301 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr465\" (UniqueName: \"kubernetes.io/projected/c66de22e-9900-45fe-b074-455829b4084a-kube-api-access-pr465\") pod \"keystone-bf7f56bd7-gdd5v\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:18 crc kubenswrapper[4819]: I0228 03:53:18.761466 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:19 crc kubenswrapper[4819]: I0228 03:53:19.056519 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v"] Feb 28 03:53:19 crc kubenswrapper[4819]: I0228 03:53:19.375938 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" event={"ID":"c66de22e-9900-45fe-b074-455829b4084a","Type":"ContainerStarted","Data":"f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259"} Feb 28 03:53:19 crc kubenswrapper[4819]: I0228 03:53:19.376761 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:19 crc kubenswrapper[4819]: I0228 03:53:19.376848 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" event={"ID":"c66de22e-9900-45fe-b074-455829b4084a","Type":"ContainerStarted","Data":"2e00e8496511b46b87434c0377b4dc040a9cbb399c62cfa317b8a43d7f9aa06b"} Feb 28 03:53:19 crc kubenswrapper[4819]: I0228 03:53:19.402372 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" podStartSLOduration=1.402354758 podStartE2EDuration="1.402354758s" podCreationTimestamp="2026-02-28 03:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:53:19.399833646 +0000 UTC m=+1137.865402544" watchObservedRunningTime="2026-02-28 03:53:19.402354758 +0000 UTC m=+1137.867923616" Feb 28 03:53:20 crc kubenswrapper[4819]: I0228 03:53:20.337206 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.569504 4819 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["barbican-kuttl-tests/barbican-db-create-zf659"] Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.571012 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-zf659" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.593871 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zrgg\" (UniqueName: \"kubernetes.io/projected/13e68315-57a9-416c-9c98-83e30d943815-kube-api-access-5zrgg\") pod \"barbican-db-create-zf659\" (UID: \"13e68315-57a9-416c-9c98-83e30d943815\") " pod="barbican-kuttl-tests/barbican-db-create-zf659" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.594103 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13e68315-57a9-416c-9c98-83e30d943815-operator-scripts\") pod \"barbican-db-create-zf659\" (UID: \"13e68315-57a9-416c-9c98-83e30d943815\") " pod="barbican-kuttl-tests/barbican-db-create-zf659" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.598809 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt"] Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.599667 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.602746 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-db-secret" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.607378 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-zf659"] Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.621442 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt"] Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.695118 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13e68315-57a9-416c-9c98-83e30d943815-operator-scripts\") pod \"barbican-db-create-zf659\" (UID: \"13e68315-57a9-416c-9c98-83e30d943815\") " pod="barbican-kuttl-tests/barbican-db-create-zf659" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.696074 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13e68315-57a9-416c-9c98-83e30d943815-operator-scripts\") pod \"barbican-db-create-zf659\" (UID: \"13e68315-57a9-416c-9c98-83e30d943815\") " pod="barbican-kuttl-tests/barbican-db-create-zf659" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.696224 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zrgg\" (UniqueName: \"kubernetes.io/projected/13e68315-57a9-416c-9c98-83e30d943815-kube-api-access-5zrgg\") pod \"barbican-db-create-zf659\" (UID: \"13e68315-57a9-416c-9c98-83e30d943815\") " pod="barbican-kuttl-tests/barbican-db-create-zf659" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.696325 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4427ba01-e5bc-4dc0-a501-47576384ba41-operator-scripts\") pod \"barbican-0662-account-create-update-zcwnt\" (UID: \"4427ba01-e5bc-4dc0-a501-47576384ba41\") " pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.696438 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbnzk\" (UniqueName: \"kubernetes.io/projected/4427ba01-e5bc-4dc0-a501-47576384ba41-kube-api-access-xbnzk\") pod \"barbican-0662-account-create-update-zcwnt\" (UID: \"4427ba01-e5bc-4dc0-a501-47576384ba41\") " pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.728800 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zrgg\" (UniqueName: \"kubernetes.io/projected/13e68315-57a9-416c-9c98-83e30d943815-kube-api-access-5zrgg\") pod \"barbican-db-create-zf659\" (UID: \"13e68315-57a9-416c-9c98-83e30d943815\") " pod="barbican-kuttl-tests/barbican-db-create-zf659" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.798411 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4427ba01-e5bc-4dc0-a501-47576384ba41-operator-scripts\") pod \"barbican-0662-account-create-update-zcwnt\" (UID: \"4427ba01-e5bc-4dc0-a501-47576384ba41\") " pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.798462 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbnzk\" (UniqueName: \"kubernetes.io/projected/4427ba01-e5bc-4dc0-a501-47576384ba41-kube-api-access-xbnzk\") pod \"barbican-0662-account-create-update-zcwnt\" (UID: \"4427ba01-e5bc-4dc0-a501-47576384ba41\") " 
pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.799526 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4427ba01-e5bc-4dc0-a501-47576384ba41-operator-scripts\") pod \"barbican-0662-account-create-update-zcwnt\" (UID: \"4427ba01-e5bc-4dc0-a501-47576384ba41\") " pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.815184 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbnzk\" (UniqueName: \"kubernetes.io/projected/4427ba01-e5bc-4dc0-a501-47576384ba41-kube-api-access-xbnzk\") pod \"barbican-0662-account-create-update-zcwnt\" (UID: \"4427ba01-e5bc-4dc0-a501-47576384ba41\") " pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.891892 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-zf659" Feb 28 03:53:24 crc kubenswrapper[4819]: I0228 03:53:24.918765 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" Feb 28 03:53:25 crc kubenswrapper[4819]: I0228 03:53:25.225923 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt"] Feb 28 03:53:25 crc kubenswrapper[4819]: I0228 03:53:25.358767 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-zf659"] Feb 28 03:53:25 crc kubenswrapper[4819]: W0228 03:53:25.360261 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13e68315_57a9_416c_9c98_83e30d943815.slice/crio-4fae63de70f9d5f2d34dbd8a9d2b54456c0070a0cbf5b204aea37103bcdc37b7 WatchSource:0}: Error finding container 4fae63de70f9d5f2d34dbd8a9d2b54456c0070a0cbf5b204aea37103bcdc37b7: Status 404 returned error can't find the container with id 4fae63de70f9d5f2d34dbd8a9d2b54456c0070a0cbf5b204aea37103bcdc37b7 Feb 28 03:53:25 crc kubenswrapper[4819]: I0228 03:53:25.435766 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-zf659" event={"ID":"13e68315-57a9-416c-9c98-83e30d943815","Type":"ContainerStarted","Data":"4fae63de70f9d5f2d34dbd8a9d2b54456c0070a0cbf5b204aea37103bcdc37b7"} Feb 28 03:53:25 crc kubenswrapper[4819]: I0228 03:53:25.437373 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" event={"ID":"4427ba01-e5bc-4dc0-a501-47576384ba41","Type":"ContainerStarted","Data":"ca4e90c95af3abe85a0867527182d2e377f6488de6fb00113a1103be905a49d4"} Feb 28 03:53:25 crc kubenswrapper[4819]: I0228 03:53:25.437402 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" event={"ID":"4427ba01-e5bc-4dc0-a501-47576384ba41","Type":"ContainerStarted","Data":"0b3799dc0966249d05359ef7d8dac9c390a95d99ccf3a03d1973a681e84769b5"} Feb 
28 03:53:26 crc kubenswrapper[4819]: I0228 03:53:26.448328 4819 generic.go:334] "Generic (PLEG): container finished" podID="13e68315-57a9-416c-9c98-83e30d943815" containerID="37b4c5862d7467856fa9d6a22198a9df342f05267729e82021b04d502c72f8c5" exitCode=0 Feb 28 03:53:26 crc kubenswrapper[4819]: I0228 03:53:26.448465 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-zf659" event={"ID":"13e68315-57a9-416c-9c98-83e30d943815","Type":"ContainerDied","Data":"37b4c5862d7467856fa9d6a22198a9df342f05267729e82021b04d502c72f8c5"} Feb 28 03:53:26 crc kubenswrapper[4819]: I0228 03:53:26.450571 4819 generic.go:334] "Generic (PLEG): container finished" podID="4427ba01-e5bc-4dc0-a501-47576384ba41" containerID="ca4e90c95af3abe85a0867527182d2e377f6488de6fb00113a1103be905a49d4" exitCode=0 Feb 28 03:53:26 crc kubenswrapper[4819]: I0228 03:53:26.450642 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" event={"ID":"4427ba01-e5bc-4dc0-a501-47576384ba41","Type":"ContainerDied","Data":"ca4e90c95af3abe85a0867527182d2e377f6488de6fb00113a1103be905a49d4"} Feb 28 03:53:26 crc kubenswrapper[4819]: I0228 03:53:26.474023 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" podStartSLOduration=2.473998045 podStartE2EDuration="2.473998045s" podCreationTimestamp="2026-02-28 03:53:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:53:25.45303263 +0000 UTC m=+1143.918601498" watchObservedRunningTime="2026-02-28 03:53:26.473998045 +0000 UTC m=+1144.939566943" Feb 28 03:53:27 crc kubenswrapper[4819]: I0228 03:53:27.895196 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" Feb 28 03:53:27 crc kubenswrapper[4819]: I0228 03:53:27.898709 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-zf659" Feb 28 03:53:27 crc kubenswrapper[4819]: I0228 03:53:27.941164 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zrgg\" (UniqueName: \"kubernetes.io/projected/13e68315-57a9-416c-9c98-83e30d943815-kube-api-access-5zrgg\") pod \"13e68315-57a9-416c-9c98-83e30d943815\" (UID: \"13e68315-57a9-416c-9c98-83e30d943815\") " Feb 28 03:53:27 crc kubenswrapper[4819]: I0228 03:53:27.941265 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4427ba01-e5bc-4dc0-a501-47576384ba41-operator-scripts\") pod \"4427ba01-e5bc-4dc0-a501-47576384ba41\" (UID: \"4427ba01-e5bc-4dc0-a501-47576384ba41\") " Feb 28 03:53:27 crc kubenswrapper[4819]: I0228 03:53:27.941291 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbnzk\" (UniqueName: \"kubernetes.io/projected/4427ba01-e5bc-4dc0-a501-47576384ba41-kube-api-access-xbnzk\") pod \"4427ba01-e5bc-4dc0-a501-47576384ba41\" (UID: \"4427ba01-e5bc-4dc0-a501-47576384ba41\") " Feb 28 03:53:27 crc kubenswrapper[4819]: I0228 03:53:27.941392 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13e68315-57a9-416c-9c98-83e30d943815-operator-scripts\") pod \"13e68315-57a9-416c-9c98-83e30d943815\" (UID: \"13e68315-57a9-416c-9c98-83e30d943815\") " Feb 28 03:53:27 crc kubenswrapper[4819]: I0228 03:53:27.942381 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e68315-57a9-416c-9c98-83e30d943815-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "13e68315-57a9-416c-9c98-83e30d943815" (UID: "13e68315-57a9-416c-9c98-83e30d943815"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:53:27 crc kubenswrapper[4819]: I0228 03:53:27.943362 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4427ba01-e5bc-4dc0-a501-47576384ba41-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4427ba01-e5bc-4dc0-a501-47576384ba41" (UID: "4427ba01-e5bc-4dc0-a501-47576384ba41"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:53:27 crc kubenswrapper[4819]: I0228 03:53:27.948335 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13e68315-57a9-416c-9c98-83e30d943815-kube-api-access-5zrgg" (OuterVolumeSpecName: "kube-api-access-5zrgg") pod "13e68315-57a9-416c-9c98-83e30d943815" (UID: "13e68315-57a9-416c-9c98-83e30d943815"). InnerVolumeSpecName "kube-api-access-5zrgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:53:27 crc kubenswrapper[4819]: I0228 03:53:27.948556 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4427ba01-e5bc-4dc0-a501-47576384ba41-kube-api-access-xbnzk" (OuterVolumeSpecName: "kube-api-access-xbnzk") pod "4427ba01-e5bc-4dc0-a501-47576384ba41" (UID: "4427ba01-e5bc-4dc0-a501-47576384ba41"). InnerVolumeSpecName "kube-api-access-xbnzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:53:28 crc kubenswrapper[4819]: I0228 03:53:28.043453 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13e68315-57a9-416c-9c98-83e30d943815-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:28 crc kubenswrapper[4819]: I0228 03:53:28.043501 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zrgg\" (UniqueName: \"kubernetes.io/projected/13e68315-57a9-416c-9c98-83e30d943815-kube-api-access-5zrgg\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:28 crc kubenswrapper[4819]: I0228 03:53:28.043522 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4427ba01-e5bc-4dc0-a501-47576384ba41-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:28 crc kubenswrapper[4819]: I0228 03:53:28.043540 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbnzk\" (UniqueName: \"kubernetes.io/projected/4427ba01-e5bc-4dc0-a501-47576384ba41-kube-api-access-xbnzk\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:28 crc kubenswrapper[4819]: I0228 03:53:28.471061 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-zf659" event={"ID":"13e68315-57a9-416c-9c98-83e30d943815","Type":"ContainerDied","Data":"4fae63de70f9d5f2d34dbd8a9d2b54456c0070a0cbf5b204aea37103bcdc37b7"} Feb 28 03:53:28 crc kubenswrapper[4819]: I0228 03:53:28.471108 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fae63de70f9d5f2d34dbd8a9d2b54456c0070a0cbf5b204aea37103bcdc37b7" Feb 28 03:53:28 crc kubenswrapper[4819]: I0228 03:53:28.471080 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-zf659" Feb 28 03:53:28 crc kubenswrapper[4819]: I0228 03:53:28.473462 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" event={"ID":"4427ba01-e5bc-4dc0-a501-47576384ba41","Type":"ContainerDied","Data":"0b3799dc0966249d05359ef7d8dac9c390a95d99ccf3a03d1973a681e84769b5"} Feb 28 03:53:28 crc kubenswrapper[4819]: I0228 03:53:28.473516 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b3799dc0966249d05359ef7d8dac9c390a95d99ccf3a03d1973a681e84769b5" Feb 28 03:53:28 crc kubenswrapper[4819]: I0228 03:53:28.473489 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt" Feb 28 03:53:29 crc kubenswrapper[4819]: I0228 03:53:29.897327 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-fcrsq"] Feb 28 03:53:29 crc kubenswrapper[4819]: E0228 03:53:29.898165 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13e68315-57a9-416c-9c98-83e30d943815" containerName="mariadb-database-create" Feb 28 03:53:29 crc kubenswrapper[4819]: I0228 03:53:29.898197 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e68315-57a9-416c-9c98-83e30d943815" containerName="mariadb-database-create" Feb 28 03:53:29 crc kubenswrapper[4819]: E0228 03:53:29.898231 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4427ba01-e5bc-4dc0-a501-47576384ba41" containerName="mariadb-account-create-update" Feb 28 03:53:29 crc kubenswrapper[4819]: I0228 03:53:29.898282 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="4427ba01-e5bc-4dc0-a501-47576384ba41" containerName="mariadb-account-create-update" Feb 28 03:53:29 crc kubenswrapper[4819]: I0228 03:53:29.898565 4819 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4427ba01-e5bc-4dc0-a501-47576384ba41" containerName="mariadb-account-create-update" Feb 28 03:53:29 crc kubenswrapper[4819]: I0228 03:53:29.898610 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="13e68315-57a9-416c-9c98-83e30d943815" containerName="mariadb-database-create" Feb 28 03:53:29 crc kubenswrapper[4819]: I0228 03:53:29.899857 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" Feb 28 03:53:29 crc kubenswrapper[4819]: I0228 03:53:29.901783 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-7928s" Feb 28 03:53:29 crc kubenswrapper[4819]: I0228 03:53:29.903474 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Feb 28 03:53:29 crc kubenswrapper[4819]: I0228 03:53:29.910667 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-fcrsq"] Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.077457 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54ztt\" (UniqueName: \"kubernetes.io/projected/c8c43a99-ce90-404f-bc48-191bd422fced-kube-api-access-54ztt\") pod \"barbican-db-sync-fcrsq\" (UID: \"c8c43a99-ce90-404f-bc48-191bd422fced\") " pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.077702 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8c43a99-ce90-404f-bc48-191bd422fced-db-sync-config-data\") pod \"barbican-db-sync-fcrsq\" (UID: \"c8c43a99-ce90-404f-bc48-191bd422fced\") " pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.179520 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8c43a99-ce90-404f-bc48-191bd422fced-db-sync-config-data\") pod \"barbican-db-sync-fcrsq\" (UID: \"c8c43a99-ce90-404f-bc48-191bd422fced\") " pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.179719 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54ztt\" (UniqueName: \"kubernetes.io/projected/c8c43a99-ce90-404f-bc48-191bd422fced-kube-api-access-54ztt\") pod \"barbican-db-sync-fcrsq\" (UID: \"c8c43a99-ce90-404f-bc48-191bd422fced\") " pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.188805 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8c43a99-ce90-404f-bc48-191bd422fced-db-sync-config-data\") pod \"barbican-db-sync-fcrsq\" (UID: \"c8c43a99-ce90-404f-bc48-191bd422fced\") " pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.208915 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54ztt\" (UniqueName: \"kubernetes.io/projected/c8c43a99-ce90-404f-bc48-191bd422fced-kube-api-access-54ztt\") pod \"barbican-db-sync-fcrsq\" (UID: \"c8c43a99-ce90-404f-bc48-191bd422fced\") " pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.227170 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" Feb 28 03:53:30 crc kubenswrapper[4819]: W0228 03:53:30.722615 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8c43a99_ce90_404f_bc48_191bd422fced.slice/crio-7bda9d720560ff26bf978b67910bcff13a9655315e7395842bf04a5d369aff93 WatchSource:0}: Error finding container 7bda9d720560ff26bf978b67910bcff13a9655315e7395842bf04a5d369aff93: Status 404 returned error can't find the container with id 7bda9d720560ff26bf978b67910bcff13a9655315e7395842bf04a5d369aff93 Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.723116 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-fcrsq"] Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.834272 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.834351 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.834418 4819 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.835319 4819 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"2be5a3de849f4caa81a3f6eb2371d580108119159dd0203e877d29c0441c1708"} pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 03:53:30 crc kubenswrapper[4819]: I0228 03:53:30.835408 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" containerID="cri-o://2be5a3de849f4caa81a3f6eb2371d580108119159dd0203e877d29c0441c1708" gracePeriod=600 Feb 28 03:53:31 crc kubenswrapper[4819]: I0228 03:53:31.496822 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" event={"ID":"c8c43a99-ce90-404f-bc48-191bd422fced","Type":"ContainerStarted","Data":"7bda9d720560ff26bf978b67910bcff13a9655315e7395842bf04a5d369aff93"} Feb 28 03:53:31 crc kubenswrapper[4819]: I0228 03:53:31.500001 4819 generic.go:334] "Generic (PLEG): container finished" podID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerID="2be5a3de849f4caa81a3f6eb2371d580108119159dd0203e877d29c0441c1708" exitCode=0 Feb 28 03:53:31 crc kubenswrapper[4819]: I0228 03:53:31.500072 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerDied","Data":"2be5a3de849f4caa81a3f6eb2371d580108119159dd0203e877d29c0441c1708"} Feb 28 03:53:31 crc kubenswrapper[4819]: I0228 03:53:31.500329 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerStarted","Data":"9d369a9c21ccdd5b3db603da688e0c28628885c9c52c044661ee7b6146a29101"} Feb 28 03:53:31 crc kubenswrapper[4819]: I0228 03:53:31.500354 4819 scope.go:117] 
"RemoveContainer" containerID="5ab93bb2251f8a9fb9c9db9bc6189036f7bfbd545e0f1b6f246a96c7b8188206" Feb 28 03:53:36 crc kubenswrapper[4819]: I0228 03:53:36.547356 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" event={"ID":"c8c43a99-ce90-404f-bc48-191bd422fced","Type":"ContainerStarted","Data":"e946fea03afa6470a8097019c351aa8266932a3e0ece422234188c8f8e4a7bbc"} Feb 28 03:53:39 crc kubenswrapper[4819]: I0228 03:53:39.577397 4819 generic.go:334] "Generic (PLEG): container finished" podID="c8c43a99-ce90-404f-bc48-191bd422fced" containerID="e946fea03afa6470a8097019c351aa8266932a3e0ece422234188c8f8e4a7bbc" exitCode=0 Feb 28 03:53:39 crc kubenswrapper[4819]: I0228 03:53:39.577479 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" event={"ID":"c8c43a99-ce90-404f-bc48-191bd422fced","Type":"ContainerDied","Data":"e946fea03afa6470a8097019c351aa8266932a3e0ece422234188c8f8e4a7bbc"} Feb 28 03:53:40 crc kubenswrapper[4819]: I0228 03:53:40.974873 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.072060 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54ztt\" (UniqueName: \"kubernetes.io/projected/c8c43a99-ce90-404f-bc48-191bd422fced-kube-api-access-54ztt\") pod \"c8c43a99-ce90-404f-bc48-191bd422fced\" (UID: \"c8c43a99-ce90-404f-bc48-191bd422fced\") " Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.072275 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8c43a99-ce90-404f-bc48-191bd422fced-db-sync-config-data\") pod \"c8c43a99-ce90-404f-bc48-191bd422fced\" (UID: \"c8c43a99-ce90-404f-bc48-191bd422fced\") " Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.081508 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8c43a99-ce90-404f-bc48-191bd422fced-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c8c43a99-ce90-404f-bc48-191bd422fced" (UID: "c8c43a99-ce90-404f-bc48-191bd422fced"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.081723 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c43a99-ce90-404f-bc48-191bd422fced-kube-api-access-54ztt" (OuterVolumeSpecName: "kube-api-access-54ztt") pod "c8c43a99-ce90-404f-bc48-191bd422fced" (UID: "c8c43a99-ce90-404f-bc48-191bd422fced"). InnerVolumeSpecName "kube-api-access-54ztt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.174455 4819 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8c43a99-ce90-404f-bc48-191bd422fced-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.174521 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54ztt\" (UniqueName: \"kubernetes.io/projected/c8c43a99-ce90-404f-bc48-191bd422fced-kube-api-access-54ztt\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.609941 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" event={"ID":"c8c43a99-ce90-404f-bc48-191bd422fced","Type":"ContainerDied","Data":"7bda9d720560ff26bf978b67910bcff13a9655315e7395842bf04a5d369aff93"} Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.609998 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bda9d720560ff26bf978b67910bcff13a9655315e7395842bf04a5d369aff93" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.610053 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-fcrsq" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.827727 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9"] Feb 28 03:53:41 crc kubenswrapper[4819]: E0228 03:53:41.828074 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c43a99-ce90-404f-bc48-191bd422fced" containerName="barbican-db-sync" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.828098 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c43a99-ce90-404f-bc48-191bd422fced" containerName="barbican-db-sync" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.828280 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c43a99-ce90-404f-bc48-191bd422fced" containerName="barbican-db-sync" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.829104 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.832550 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.834523 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-7928s" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.834754 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-keystone-listener-config-data" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.838394 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8"] Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.839513 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.842606 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-worker-config-data" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.850296 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9"] Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.858118 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8"] Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.985220 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data\") pod \"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.985331 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8x2v\" (UniqueName: \"kubernetes.io/projected/55cdaae2-d1fa-4630-a612-1f1418a0a640-kube-api-access-w8x2v\") pod \"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.985489 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55cdaae2-d1fa-4630-a612-1f1418a0a640-logs\") pod \"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.985642 4819 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptzc8\" (UniqueName: \"kubernetes.io/projected/1ef2690e-bd56-4516-8e34-24d5b6299a11-kube-api-access-ptzc8\") pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.985796 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data\") pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.985848 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data-custom\") pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.985916 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ef2690e-bd56-4516-8e34-24d5b6299a11-logs\") pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.985956 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data-custom\") pod 
\"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.991140 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv"] Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.992295 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:41 crc kubenswrapper[4819]: I0228 03:53:41.994992 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-api-config-data" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.038259 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv"] Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.087328 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptzc8\" (UniqueName: \"kubernetes.io/projected/1ef2690e-bd56-4516-8e34-24d5b6299a11-kube-api-access-ptzc8\") pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.087425 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data\") pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.087455 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data-custom\") 
pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.087485 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ef2690e-bd56-4516-8e34-24d5b6299a11-logs\") pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.087513 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data-custom\") pod \"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.087552 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8x2v\" (UniqueName: \"kubernetes.io/projected/55cdaae2-d1fa-4630-a612-1f1418a0a640-kube-api-access-w8x2v\") pod \"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.087581 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data\") pod \"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.087630 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/55cdaae2-d1fa-4630-a612-1f1418a0a640-logs\") pod \"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.087969 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ef2690e-bd56-4516-8e34-24d5b6299a11-logs\") pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.088107 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55cdaae2-d1fa-4630-a612-1f1418a0a640-logs\") pod \"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.092136 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data-custom\") pod \"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.092433 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data\") pod \"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.095441 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data\") pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.102218 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data-custom\") pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.105146 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptzc8\" (UniqueName: \"kubernetes.io/projected/1ef2690e-bd56-4516-8e34-24d5b6299a11-kube-api-access-ptzc8\") pod \"barbican-keystone-listener-5bdd47f94d-pskq9\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.112978 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8x2v\" (UniqueName: \"kubernetes.io/projected/55cdaae2-d1fa-4630-a612-1f1418a0a640-kube-api-access-w8x2v\") pod \"barbican-worker-64c846bbd9-c8mk8\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.158125 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.169805 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.192859 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data-custom\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.192978 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26h4q\" (UniqueName: \"kubernetes.io/projected/a394bdbf-e06e-4ed6-b77f-93885589bded-kube-api-access-26h4q\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.193031 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.193100 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a394bdbf-e06e-4ed6-b77f-93885589bded-logs\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.296010 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data-custom\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.296071 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26h4q\" (UniqueName: \"kubernetes.io/projected/a394bdbf-e06e-4ed6-b77f-93885589bded-kube-api-access-26h4q\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.296098 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.296133 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a394bdbf-e06e-4ed6-b77f-93885589bded-logs\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.297699 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a394bdbf-e06e-4ed6-b77f-93885589bded-logs\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.303645 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.314907 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26h4q\" (UniqueName: \"kubernetes.io/projected/a394bdbf-e06e-4ed6-b77f-93885589bded-kube-api-access-26h4q\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.316058 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data-custom\") pod \"barbican-api-5654c9f87b-cv9wv\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.441994 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8"] Feb 28 03:53:42 crc kubenswrapper[4819]: W0228 03:53:42.447649 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55cdaae2_d1fa_4630_a612_1f1418a0a640.slice/crio-0979827e38e2e9535bcc430d271cc195f1ca678b0189aefff166cc4002372092 WatchSource:0}: Error finding container 0979827e38e2e9535bcc430d271cc195f1ca678b0189aefff166cc4002372092: Status 404 returned error can't find the container with id 0979827e38e2e9535bcc430d271cc195f1ca678b0189aefff166cc4002372092 Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.611813 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.622539 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9"] Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.622785 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" event={"ID":"55cdaae2-d1fa-4630-a612-1f1418a0a640","Type":"ContainerStarted","Data":"0979827e38e2e9535bcc430d271cc195f1ca678b0189aefff166cc4002372092"} Feb 28 03:53:42 crc kubenswrapper[4819]: W0228 03:53:42.634681 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ef2690e_bd56_4516_8e34_24d5b6299a11.slice/crio-1599f5462f3b9363b483d4964fb9c57203cecb441cec3b5f92f7988cddfc271a WatchSource:0}: Error finding container 1599f5462f3b9363b483d4964fb9c57203cecb441cec3b5f92f7988cddfc271a: Status 404 returned error can't find the container with id 1599f5462f3b9363b483d4964fb9c57203cecb441cec3b5f92f7988cddfc271a Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.896956 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz"] Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.898087 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.913832 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz"] Feb 28 03:53:42 crc kubenswrapper[4819]: I0228 03:53:42.934403 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv"] Feb 28 03:53:42 crc kubenswrapper[4819]: W0228 03:53:42.945121 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda394bdbf_e06e_4ed6_b77f_93885589bded.slice/crio-e115135b0720efbfedda5f49189d82f675d3377e465ff7a9131f93ab70ab0ad8 WatchSource:0}: Error finding container e115135b0720efbfedda5f49189d82f675d3377e465ff7a9131f93ab70ab0ad8: Status 404 returned error can't find the container with id e115135b0720efbfedda5f49189d82f675d3377e465ff7a9131f93ab70ab0ad8 Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.006416 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-logs\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.006730 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data-custom\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.007547 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnnpg\" 
(UniqueName: \"kubernetes.io/projected/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-kube-api-access-lnnpg\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.007824 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.065621 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg"] Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.080751 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.098652 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg"] Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.120674 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8w2j\" (UniqueName: \"kubernetes.io/projected/398688ee-ba77-4410-8294-42eaffb91650-kube-api-access-b8w2j\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.120725 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: 
\"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.120751 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-logs\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.120772 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data-custom\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.120790 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398688ee-ba77-4410-8294-42eaffb91650-logs\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.120811 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.120835 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data-custom\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.120875 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnnpg\" (UniqueName: \"kubernetes.io/projected/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-kube-api-access-lnnpg\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.122421 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-logs\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.125956 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data-custom\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.130051 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.137837 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnnpg\" (UniqueName: 
\"kubernetes.io/projected/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-kube-api-access-lnnpg\") pod \"barbican-api-5654c9f87b-vgddz\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.222486 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.222599 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8w2j\" (UniqueName: \"kubernetes.io/projected/398688ee-ba77-4410-8294-42eaffb91650-kube-api-access-b8w2j\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.222633 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data-custom\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.222648 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398688ee-ba77-4410-8294-42eaffb91650-logs\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 
03:53:43.223193 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398688ee-ba77-4410-8294-42eaffb91650-logs\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.228210 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data-custom\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.229089 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.232570 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.253373 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8w2j\" (UniqueName: \"kubernetes.io/projected/398688ee-ba77-4410-8294-42eaffb91650-kube-api-access-b8w2j\") pod \"barbican-keystone-listener-5bdd47f94d-df2rg\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.284099 4819 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"] Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.285508 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.293316 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"] Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.426182 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data\") pod \"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.426254 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data-custom\") pod \"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.426911 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxs9b\" (UniqueName: \"kubernetes.io/projected/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-kube-api-access-cxs9b\") pod \"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.427009 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-logs\") pod 
\"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.474584 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.528221 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data\") pod \"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.528319 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data-custom\") pod \"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.528402 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxs9b\" (UniqueName: \"kubernetes.io/projected/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-kube-api-access-cxs9b\") pod \"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.528429 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-logs\") pod \"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.528927 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-logs\") pod \"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.548767 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data\") pod \"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.548818 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data-custom\") pod \"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.565763 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxs9b\" (UniqueName: \"kubernetes.io/projected/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-kube-api-access-cxs9b\") pod \"barbican-worker-64c846bbd9-7pv7l\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.603926 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.638259 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" event={"ID":"1ef2690e-bd56-4516-8e34-24d5b6299a11","Type":"ContainerStarted","Data":"1599f5462f3b9363b483d4964fb9c57203cecb441cec3b5f92f7988cddfc271a"}
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.639974 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" event={"ID":"a394bdbf-e06e-4ed6-b77f-93885589bded","Type":"ContainerStarted","Data":"2433185f425eb19da607466024b391582e4ee7562d5f3a0458c511e81b9fb905"}
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.640001 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" event={"ID":"a394bdbf-e06e-4ed6-b77f-93885589bded","Type":"ContainerStarted","Data":"90e42d2b7ca1d801b66730ab91fb18677f68bbf54c3d05d2bcf8548bea9b0565"}
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.640012 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" event={"ID":"a394bdbf-e06e-4ed6-b77f-93885589bded","Type":"ContainerStarted","Data":"e115135b0720efbfedda5f49189d82f675d3377e465ff7a9131f93ab70ab0ad8"}
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.640156 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.640208 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.656205 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz"]
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.658798 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" podStartSLOduration=2.6587829960000002 podStartE2EDuration="2.658782996s" podCreationTimestamp="2026-02-28 03:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:53:43.657533055 +0000 UTC m=+1162.123101913" watchObservedRunningTime="2026-02-28 03:53:43.658782996 +0000 UTC m=+1162.124351854"
Feb 28 03:53:43 crc kubenswrapper[4819]: I0228 03:53:43.694301 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg"]
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.057672 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"]
Feb 28 03:53:44 crc kubenswrapper[4819]: W0228 03:53:44.187857 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode49c04aa_14d4_4d76_ae1f_73c8aa193b2c.slice/crio-9015c2cbeefa07ac93b4e9e9324e3c9ca23925ee9207d178a21161ae0865312e WatchSource:0}: Error finding container 9015c2cbeefa07ac93b4e9e9324e3c9ca23925ee9207d178a21161ae0865312e: Status 404 returned error can't find the container with id 9015c2cbeefa07ac93b4e9e9324e3c9ca23925ee9207d178a21161ae0865312e
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.405572 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz"]
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.652706 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" event={"ID":"55cdaae2-d1fa-4630-a612-1f1418a0a640","Type":"ContainerStarted","Data":"aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517"}
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.652751 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" event={"ID":"55cdaae2-d1fa-4630-a612-1f1418a0a640","Type":"ContainerStarted","Data":"3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9"}
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.656646 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" event={"ID":"398688ee-ba77-4410-8294-42eaffb91650","Type":"ContainerStarted","Data":"8d9a94176a82fab753625a4eaa119da38625c376a63e494777974f1db1cf05e9"}
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.658087 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" event={"ID":"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c","Type":"ContainerStarted","Data":"9015c2cbeefa07ac93b4e9e9324e3c9ca23925ee9207d178a21161ae0865312e"}
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.665779 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" event={"ID":"1ef2690e-bd56-4516-8e34-24d5b6299a11","Type":"ContainerStarted","Data":"55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5"}
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.665839 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" event={"ID":"1ef2690e-bd56-4516-8e34-24d5b6299a11","Type":"ContainerStarted","Data":"ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c"}
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.673829 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" event={"ID":"08a26f1d-d188-41ce-8a4f-bc52a2d8492f","Type":"ContainerStarted","Data":"85953e08ebfdef4466a3c04065f20174d6747c3e50fd306058993b49070a4421"}
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.673881 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" event={"ID":"08a26f1d-d188-41ce-8a4f-bc52a2d8492f","Type":"ContainerStarted","Data":"a5059f98626fd4d049ccd48c8fc212c6336a4c7f426f9ba0951eef14c493f5ba"}
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.673895 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" event={"ID":"08a26f1d-d188-41ce-8a4f-bc52a2d8492f","Type":"ContainerStarted","Data":"ae0b2568c716fda617f4a6d45e6a3a37c45c72246f37c450b591cf755d6f8a13"}
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.673837 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api-log" containerID="cri-o://a5059f98626fd4d049ccd48c8fc212c6336a4c7f426f9ba0951eef14c493f5ba" gracePeriod=30
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.673968 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz"
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.673996 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz"
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.674102 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api" containerID="cri-o://85953e08ebfdef4466a3c04065f20174d6747c3e50fd306058993b49070a4421" gracePeriod=30
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.676217 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" podStartSLOduration=1.895364604 podStartE2EDuration="3.676190352s" podCreationTimestamp="2026-02-28 03:53:41 +0000 UTC" firstStartedPulling="2026-02-28 03:53:42.44951852 +0000 UTC m=+1160.915087378" lastFinishedPulling="2026-02-28 03:53:44.230344258 +0000 UTC m=+1162.695913126" observedRunningTime="2026-02-28 03:53:44.669839766 +0000 UTC m=+1163.135408654" watchObservedRunningTime="2026-02-28 03:53:44.676190352 +0000 UTC m=+1163.141759230"
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.704468 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" podStartSLOduration=2.110228299 podStartE2EDuration="3.704442078s" podCreationTimestamp="2026-02-28 03:53:41 +0000 UTC" firstStartedPulling="2026-02-28 03:53:42.638091947 +0000 UTC m=+1161.103660855" lastFinishedPulling="2026-02-28 03:53:44.232305766 +0000 UTC m=+1162.697874634" observedRunningTime="2026-02-28 03:53:44.694908264 +0000 UTC m=+1163.160477172" watchObservedRunningTime="2026-02-28 03:53:44.704442078 +0000 UTC m=+1163.170010946"
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.722990 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg"]
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.730171 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podStartSLOduration=2.7301534419999998 podStartE2EDuration="2.730153442s" podCreationTimestamp="2026-02-28 03:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:53:44.717405138 +0000 UTC m=+1163.182974006" watchObservedRunningTime="2026-02-28 03:53:44.730153442 +0000 UTC m=+1163.195722300"
Feb 28 03:53:44 crc kubenswrapper[4819]: I0228 03:53:44.804376 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"]
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.685008 4819 generic.go:334] "Generic (PLEG): container finished" podID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerID="a5059f98626fd4d049ccd48c8fc212c6336a4c7f426f9ba0951eef14c493f5ba" exitCode=143
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.685128 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" event={"ID":"08a26f1d-d188-41ce-8a4f-bc52a2d8492f","Type":"ContainerDied","Data":"a5059f98626fd4d049ccd48c8fc212c6336a4c7f426f9ba0951eef14c493f5ba"}
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.687540 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" event={"ID":"398688ee-ba77-4410-8294-42eaffb91650","Type":"ContainerStarted","Data":"389b6726bf5f90963538c33367844b38e39c3836d02eed00c327a21d4aeb46d8"}
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.687576 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" event={"ID":"398688ee-ba77-4410-8294-42eaffb91650","Type":"ContainerStarted","Data":"8a49fc8fafc6a1f43a727f2674b36df16257d18309f28060a024ea0cc865b1ce"}
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.687716 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" podUID="398688ee-ba77-4410-8294-42eaffb91650" containerName="barbican-keystone-listener-log" containerID="cri-o://8a49fc8fafc6a1f43a727f2674b36df16257d18309f28060a024ea0cc865b1ce" gracePeriod=30
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.689666 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" podUID="398688ee-ba77-4410-8294-42eaffb91650" containerName="barbican-keystone-listener" containerID="cri-o://389b6726bf5f90963538c33367844b38e39c3836d02eed00c327a21d4aeb46d8" gracePeriod=30
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.694443 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" podUID="e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" containerName="barbican-worker-log" containerID="cri-o://ef2e85c4e5ffa8da0a9736d2baedadfe364b23842f758ba61c75bad084772775" gracePeriod=30
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.694550 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" podUID="e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" containerName="barbican-worker" containerID="cri-o://5d3a04cdef83005e050d26de1a862b569d03dfe3a5039255a27a4917f1cb7f7f" gracePeriod=30
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.694660 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" event={"ID":"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c","Type":"ContainerStarted","Data":"5d3a04cdef83005e050d26de1a862b569d03dfe3a5039255a27a4917f1cb7f7f"}
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.694686 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" event={"ID":"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c","Type":"ContainerStarted","Data":"ef2e85c4e5ffa8da0a9736d2baedadfe364b23842f758ba61c75bad084772775"}
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.716035 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" podStartSLOduration=2.322794805 podStartE2EDuration="2.716005152s" podCreationTimestamp="2026-02-28 03:53:43 +0000 UTC" firstStartedPulling="2026-02-28 03:53:44.182044188 +0000 UTC m=+1162.647613036" lastFinishedPulling="2026-02-28 03:53:44.575254525 +0000 UTC m=+1163.040823383" observedRunningTime="2026-02-28 03:53:45.710343453 +0000 UTC m=+1164.175912391" watchObservedRunningTime="2026-02-28 03:53:45.716005152 +0000 UTC m=+1164.181574050"
Feb 28 03:53:45 crc kubenswrapper[4819]: I0228 03:53:45.737194 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" podStartSLOduration=2.354099836 podStartE2EDuration="2.737167274s" podCreationTimestamp="2026-02-28 03:53:43 +0000 UTC" firstStartedPulling="2026-02-28 03:53:44.216313902 +0000 UTC m=+1162.681882760" lastFinishedPulling="2026-02-28 03:53:44.59938134 +0000 UTC m=+1163.064950198" observedRunningTime="2026-02-28 03:53:45.732034447 +0000 UTC m=+1164.197603335" watchObservedRunningTime="2026-02-28 03:53:45.737167274 +0000 UTC m=+1164.202736172"
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.030571 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv"]
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.030790 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" podUID="a394bdbf-e06e-4ed6-b77f-93885589bded" containerName="barbican-api-log" containerID="cri-o://90e42d2b7ca1d801b66730ab91fb18677f68bbf54c3d05d2bcf8548bea9b0565" gracePeriod=30
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.030911 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" podUID="a394bdbf-e06e-4ed6-b77f-93885589bded" containerName="barbican-api" containerID="cri-o://2433185f425eb19da607466024b391582e4ee7562d5f3a0458c511e81b9fb905" gracePeriod=30
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.298123 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9"]
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.425080 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8"]
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.707413 4819 generic.go:334] "Generic (PLEG): container finished" podID="a394bdbf-e06e-4ed6-b77f-93885589bded" containerID="2433185f425eb19da607466024b391582e4ee7562d5f3a0458c511e81b9fb905" exitCode=0
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.707449 4819 generic.go:334] "Generic (PLEG): container finished" podID="a394bdbf-e06e-4ed6-b77f-93885589bded" containerID="90e42d2b7ca1d801b66730ab91fb18677f68bbf54c3d05d2bcf8548bea9b0565" exitCode=143
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.707459 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" event={"ID":"a394bdbf-e06e-4ed6-b77f-93885589bded","Type":"ContainerDied","Data":"2433185f425eb19da607466024b391582e4ee7562d5f3a0458c511e81b9fb905"}
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.707514 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" event={"ID":"a394bdbf-e06e-4ed6-b77f-93885589bded","Type":"ContainerDied","Data":"90e42d2b7ca1d801b66730ab91fb18677f68bbf54c3d05d2bcf8548bea9b0565"}
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.709314 4819 generic.go:334] "Generic (PLEG): container finished" podID="398688ee-ba77-4410-8294-42eaffb91650" containerID="8a49fc8fafc6a1f43a727f2674b36df16257d18309f28060a024ea0cc865b1ce" exitCode=143
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.709408 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" event={"ID":"398688ee-ba77-4410-8294-42eaffb91650","Type":"ContainerDied","Data":"8a49fc8fafc6a1f43a727f2674b36df16257d18309f28060a024ea0cc865b1ce"}
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.710996 4819 generic.go:334] "Generic (PLEG): container finished" podID="e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" containerID="ef2e85c4e5ffa8da0a9736d2baedadfe364b23842f758ba61c75bad084772775" exitCode=143
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.711196 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" podUID="55cdaae2-d1fa-4630-a612-1f1418a0a640" containerName="barbican-worker-log" containerID="cri-o://3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9" gracePeriod=30
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.711347 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" podUID="55cdaae2-d1fa-4630-a612-1f1418a0a640" containerName="barbican-worker" containerID="cri-o://aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517" gracePeriod=30
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.711395 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" event={"ID":"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c","Type":"ContainerDied","Data":"ef2e85c4e5ffa8da0a9736d2baedadfe364b23842f758ba61c75bad084772775"}
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.711478 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" podUID="1ef2690e-bd56-4516-8e34-24d5b6299a11" containerName="barbican-keystone-listener-log" containerID="cri-o://ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c" gracePeriod=30
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.711503 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" podUID="1ef2690e-bd56-4516-8e34-24d5b6299a11" containerName="barbican-keystone-listener" containerID="cri-o://55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5" gracePeriod=30
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.891462 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv"
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.980934 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26h4q\" (UniqueName: \"kubernetes.io/projected/a394bdbf-e06e-4ed6-b77f-93885589bded-kube-api-access-26h4q\") pod \"a394bdbf-e06e-4ed6-b77f-93885589bded\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") "
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.981042 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data\") pod \"a394bdbf-e06e-4ed6-b77f-93885589bded\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") "
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.981108 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data-custom\") pod \"a394bdbf-e06e-4ed6-b77f-93885589bded\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") "
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.981134 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a394bdbf-e06e-4ed6-b77f-93885589bded-logs\") pod \"a394bdbf-e06e-4ed6-b77f-93885589bded\" (UID: \"a394bdbf-e06e-4ed6-b77f-93885589bded\") "
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.982498 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a394bdbf-e06e-4ed6-b77f-93885589bded-logs" (OuterVolumeSpecName: "logs") pod "a394bdbf-e06e-4ed6-b77f-93885589bded" (UID: "a394bdbf-e06e-4ed6-b77f-93885589bded"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.990398 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a394bdbf-e06e-4ed6-b77f-93885589bded-kube-api-access-26h4q" (OuterVolumeSpecName: "kube-api-access-26h4q") pod "a394bdbf-e06e-4ed6-b77f-93885589bded" (UID: "a394bdbf-e06e-4ed6-b77f-93885589bded"). InnerVolumeSpecName "kube-api-access-26h4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:53:46 crc kubenswrapper[4819]: I0228 03:53:46.991831 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a394bdbf-e06e-4ed6-b77f-93885589bded" (UID: "a394bdbf-e06e-4ed6-b77f-93885589bded"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.058277 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data" (OuterVolumeSpecName: "config-data") pod "a394bdbf-e06e-4ed6-b77f-93885589bded" (UID: "a394bdbf-e06e-4ed6-b77f-93885589bded"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.082911 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data\") on node \"crc\" DevicePath \"\""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.082938 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a394bdbf-e06e-4ed6-b77f-93885589bded-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.082948 4819 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a394bdbf-e06e-4ed6-b77f-93885589bded-logs\") on node \"crc\" DevicePath \"\""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.082957 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26h4q\" (UniqueName: \"kubernetes.io/projected/a394bdbf-e06e-4ed6-b77f-93885589bded-kube-api-access-26h4q\") on node \"crc\" DevicePath \"\""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.568288 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9"
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.578254 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8"
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.696332 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptzc8\" (UniqueName: \"kubernetes.io/projected/1ef2690e-bd56-4516-8e34-24d5b6299a11-kube-api-access-ptzc8\") pod \"1ef2690e-bd56-4516-8e34-24d5b6299a11\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") "
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.696488 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8x2v\" (UniqueName: \"kubernetes.io/projected/55cdaae2-d1fa-4630-a612-1f1418a0a640-kube-api-access-w8x2v\") pod \"55cdaae2-d1fa-4630-a612-1f1418a0a640\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") "
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.696527 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ef2690e-bd56-4516-8e34-24d5b6299a11-logs\") pod \"1ef2690e-bd56-4516-8e34-24d5b6299a11\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") "
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.696612 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data\") pod \"55cdaae2-d1fa-4630-a612-1f1418a0a640\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") "
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.696704 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data-custom\") pod \"55cdaae2-d1fa-4630-a612-1f1418a0a640\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") "
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.696759 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55cdaae2-d1fa-4630-a612-1f1418a0a640-logs\") pod \"55cdaae2-d1fa-4630-a612-1f1418a0a640\" (UID: \"55cdaae2-d1fa-4630-a612-1f1418a0a640\") "
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.696796 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data\") pod \"1ef2690e-bd56-4516-8e34-24d5b6299a11\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") "
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.696834 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data-custom\") pod \"1ef2690e-bd56-4516-8e34-24d5b6299a11\" (UID: \"1ef2690e-bd56-4516-8e34-24d5b6299a11\") "
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.702263 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1ef2690e-bd56-4516-8e34-24d5b6299a11" (UID: "1ef2690e-bd56-4516-8e34-24d5b6299a11"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.705088 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef2690e-bd56-4516-8e34-24d5b6299a11-logs" (OuterVolumeSpecName: "logs") pod "1ef2690e-bd56-4516-8e34-24d5b6299a11" (UID: "1ef2690e-bd56-4516-8e34-24d5b6299a11"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.705531 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cdaae2-d1fa-4630-a612-1f1418a0a640-logs" (OuterVolumeSpecName: "logs") pod "55cdaae2-d1fa-4630-a612-1f1418a0a640" (UID: "55cdaae2-d1fa-4630-a612-1f1418a0a640"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.705849 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef2690e-bd56-4516-8e34-24d5b6299a11-kube-api-access-ptzc8" (OuterVolumeSpecName: "kube-api-access-ptzc8") pod "1ef2690e-bd56-4516-8e34-24d5b6299a11" (UID: "1ef2690e-bd56-4516-8e34-24d5b6299a11"). InnerVolumeSpecName "kube-api-access-ptzc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.705885 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-fcrsq"]
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.706056 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55cdaae2-d1fa-4630-a612-1f1418a0a640-kube-api-access-w8x2v" (OuterVolumeSpecName: "kube-api-access-w8x2v") pod "55cdaae2-d1fa-4630-a612-1f1418a0a640" (UID: "55cdaae2-d1fa-4630-a612-1f1418a0a640"). InnerVolumeSpecName "kube-api-access-w8x2v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.712876 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "55cdaae2-d1fa-4630-a612-1f1418a0a640" (UID: "55cdaae2-d1fa-4630-a612-1f1418a0a640"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.718667 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-fcrsq"]
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.725878 4819 generic.go:334] "Generic (PLEG): container finished" podID="1ef2690e-bd56-4516-8e34-24d5b6299a11" containerID="55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5" exitCode=0
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.725904 4819 generic.go:334] "Generic (PLEG): container finished" podID="1ef2690e-bd56-4516-8e34-24d5b6299a11" containerID="ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c" exitCode=143
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.725965 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" event={"ID":"1ef2690e-bd56-4516-8e34-24d5b6299a11","Type":"ContainerDied","Data":"55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5"}
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.725995 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" event={"ID":"1ef2690e-bd56-4516-8e34-24d5b6299a11","Type":"ContainerDied","Data":"ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c"}
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.726011 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9" event={"ID":"1ef2690e-bd56-4516-8e34-24d5b6299a11","Type":"ContainerDied","Data":"1599f5462f3b9363b483d4964fb9c57203cecb441cec3b5f92f7988cddfc271a"}
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.726029 4819 scope.go:117] "RemoveContainer" containerID="55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5"
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.726188 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9"
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.730400 4819 generic.go:334] "Generic (PLEG): container finished" podID="55cdaae2-d1fa-4630-a612-1f1418a0a640" containerID="aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517" exitCode=0
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.730429 4819 generic.go:334] "Generic (PLEG): container finished" podID="55cdaae2-d1fa-4630-a612-1f1418a0a640" containerID="3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9" exitCode=143
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.730498 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" event={"ID":"55cdaae2-d1fa-4630-a612-1f1418a0a640","Type":"ContainerDied","Data":"aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517"}
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.730528 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" event={"ID":"55cdaae2-d1fa-4630-a612-1f1418a0a640","Type":"ContainerDied","Data":"3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9"}
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.730539 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" event={"ID":"55cdaae2-d1fa-4630-a612-1f1418a0a640","Type":"ContainerDied","Data":"0979827e38e2e9535bcc430d271cc195f1ca678b0189aefff166cc4002372092"}
Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.730607 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.743784 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" event={"ID":"a394bdbf-e06e-4ed6-b77f-93885589bded","Type":"ContainerDied","Data":"e115135b0720efbfedda5f49189d82f675d3377e465ff7a9131f93ab70ab0ad8"} Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.743886 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.750604 4819 scope.go:117] "RemoveContainer" containerID="ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.756517 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data" (OuterVolumeSpecName: "config-data") pod "55cdaae2-d1fa-4630-a612-1f1418a0a640" (UID: "55cdaae2-d1fa-4630-a612-1f1418a0a640"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.757237 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data" (OuterVolumeSpecName: "config-data") pod "1ef2690e-bd56-4516-8e34-24d5b6299a11" (UID: "1ef2690e-bd56-4516-8e34-24d5b6299a11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.778676 4819 scope.go:117] "RemoveContainer" containerID="55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5" Feb 28 03:53:47 crc kubenswrapper[4819]: E0228 03:53:47.791060 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5\": container with ID starting with 55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5 not found: ID does not exist" containerID="55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.791118 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5"} err="failed to get container status \"55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5\": rpc error: code = NotFound desc = could not find container \"55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5\": container with ID starting with 55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5 not found: ID does not exist" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.791158 4819 scope.go:117] "RemoveContainer" containerID="ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c" Feb 28 03:53:47 crc kubenswrapper[4819]: E0228 03:53:47.794413 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c\": container with ID starting with ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c not found: ID does not exist" containerID="ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.794480 
4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c"} err="failed to get container status \"ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c\": rpc error: code = NotFound desc = could not find container \"ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c\": container with ID starting with ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c not found: ID does not exist" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.794516 4819 scope.go:117] "RemoveContainer" containerID="55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.794635 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv"] Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.795557 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5"} err="failed to get container status \"55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5\": rpc error: code = NotFound desc = could not find container \"55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5\": container with ID starting with 55fa1a85eece7d67c66c0d8013e5f79f6e4ae75f6dce9670104afd2b26e12ab5 not found: ID does not exist" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.795585 4819 scope.go:117] "RemoveContainer" containerID="ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.798438 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c"} err="failed to get container status \"ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c\": rpc 
error: code = NotFound desc = could not find container \"ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c\": container with ID starting with ac6cdd23ae0e88c2983461783f8a5370371969da35d13cdf425b09580d7ffb7c not found: ID does not exist" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.798489 4819 scope.go:117] "RemoveContainer" containerID="aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799045 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican0662-account-delete-vsrf5"] Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799141 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799171 4819 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55cdaae2-d1fa-4630-a612-1f1418a0a640-logs\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799184 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799193 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ef2690e-bd56-4516-8e34-24d5b6299a11-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799203 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptzc8\" (UniqueName: \"kubernetes.io/projected/1ef2690e-bd56-4516-8e34-24d5b6299a11-kube-api-access-ptzc8\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 
03:53:47.799213 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8x2v\" (UniqueName: \"kubernetes.io/projected/55cdaae2-d1fa-4630-a612-1f1418a0a640-kube-api-access-w8x2v\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799222 4819 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ef2690e-bd56-4516-8e34-24d5b6299a11-logs\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799232 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cdaae2-d1fa-4630-a612-1f1418a0a640-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:47 crc kubenswrapper[4819]: E0228 03:53:47.799324 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef2690e-bd56-4516-8e34-24d5b6299a11" containerName="barbican-keystone-listener" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799337 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef2690e-bd56-4516-8e34-24d5b6299a11" containerName="barbican-keystone-listener" Feb 28 03:53:47 crc kubenswrapper[4819]: E0228 03:53:47.799356 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cdaae2-d1fa-4630-a612-1f1418a0a640" containerName="barbican-worker" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799364 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cdaae2-d1fa-4630-a612-1f1418a0a640" containerName="barbican-worker" Feb 28 03:53:47 crc kubenswrapper[4819]: E0228 03:53:47.799372 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a394bdbf-e06e-4ed6-b77f-93885589bded" containerName="barbican-api-log" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799379 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="a394bdbf-e06e-4ed6-b77f-93885589bded" containerName="barbican-api-log" Feb 28 03:53:47 crc kubenswrapper[4819]: 
E0228 03:53:47.799393 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a394bdbf-e06e-4ed6-b77f-93885589bded" containerName="barbican-api" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799399 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="a394bdbf-e06e-4ed6-b77f-93885589bded" containerName="barbican-api" Feb 28 03:53:47 crc kubenswrapper[4819]: E0228 03:53:47.799408 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef2690e-bd56-4516-8e34-24d5b6299a11" containerName="barbican-keystone-listener-log" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799414 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef2690e-bd56-4516-8e34-24d5b6299a11" containerName="barbican-keystone-listener-log" Feb 28 03:53:47 crc kubenswrapper[4819]: E0228 03:53:47.799424 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cdaae2-d1fa-4630-a612-1f1418a0a640" containerName="barbican-worker-log" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799430 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cdaae2-d1fa-4630-a612-1f1418a0a640" containerName="barbican-worker-log" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799542 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef2690e-bd56-4516-8e34-24d5b6299a11" containerName="barbican-keystone-listener-log" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799551 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cdaae2-d1fa-4630-a612-1f1418a0a640" containerName="barbican-worker" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799560 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cdaae2-d1fa-4630-a612-1f1418a0a640" containerName="barbican-worker-log" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799569 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef2690e-bd56-4516-8e34-24d5b6299a11" 
containerName="barbican-keystone-listener" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799578 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="a394bdbf-e06e-4ed6-b77f-93885589bded" containerName="barbican-api-log" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.799587 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="a394bdbf-e06e-4ed6-b77f-93885589bded" containerName="barbican-api" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.800031 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.804230 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-cv9wv"] Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.814226 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican0662-account-delete-vsrf5"] Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.822115 4819 scope.go:117] "RemoveContainer" containerID="3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.843095 4819 scope.go:117] "RemoveContainer" containerID="aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517" Feb 28 03:53:47 crc kubenswrapper[4819]: E0228 03:53:47.844701 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517\": container with ID starting with aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517 not found: ID does not exist" containerID="aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.844780 4819 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517"} err="failed to get container status \"aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517\": rpc error: code = NotFound desc = could not find container \"aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517\": container with ID starting with aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517 not found: ID does not exist" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.844805 4819 scope.go:117] "RemoveContainer" containerID="3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9" Feb 28 03:53:47 crc kubenswrapper[4819]: E0228 03:53:47.845694 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9\": container with ID starting with 3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9 not found: ID does not exist" containerID="3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.845723 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9"} err="failed to get container status \"3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9\": rpc error: code = NotFound desc = could not find container \"3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9\": container with ID starting with 3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9 not found: ID does not exist" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.845738 4819 scope.go:117] "RemoveContainer" containerID="aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.846331 4819 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517"} err="failed to get container status \"aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517\": rpc error: code = NotFound desc = could not find container \"aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517\": container with ID starting with aee2da53e8b797de58a5bb9823a085146fe0e8f2e597654f76e71f68bbfbb517 not found: ID does not exist" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.846345 4819 scope.go:117] "RemoveContainer" containerID="3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.846497 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9"} err="failed to get container status \"3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9\": rpc error: code = NotFound desc = could not find container \"3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9\": container with ID starting with 3c913e6cb43097655c8f49f99e57077d2bca4bee9a43301b9dceec7c893dd5e9 not found: ID does not exist" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.846512 4819 scope.go:117] "RemoveContainer" containerID="2433185f425eb19da607466024b391582e4ee7562d5f3a0458c511e81b9fb905" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.860346 4819 scope.go:117] "RemoveContainer" containerID="90e42d2b7ca1d801b66730ab91fb18677f68bbf54c3d05d2bcf8548bea9b0565" Feb 28 03:53:47 crc kubenswrapper[4819]: E0228 03:53:47.875365 4819 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda394bdbf_e06e_4ed6_b77f_93885589bded.slice\": RecentStats: unable to find data in memory cache]" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 
03:53:47.900152 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljlt7\" (UniqueName: \"kubernetes.io/projected/f52a10b3-813e-439c-b3f6-1a5970cd5eec-kube-api-access-ljlt7\") pod \"barbican0662-account-delete-vsrf5\" (UID: \"f52a10b3-813e-439c-b3f6-1a5970cd5eec\") " pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" Feb 28 03:53:47 crc kubenswrapper[4819]: I0228 03:53:47.900403 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52a10b3-813e-439c-b3f6-1a5970cd5eec-operator-scripts\") pod \"barbican0662-account-delete-vsrf5\" (UID: \"f52a10b3-813e-439c-b3f6-1a5970cd5eec\") " pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.002564 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52a10b3-813e-439c-b3f6-1a5970cd5eec-operator-scripts\") pod \"barbican0662-account-delete-vsrf5\" (UID: \"f52a10b3-813e-439c-b3f6-1a5970cd5eec\") " pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.002909 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljlt7\" (UniqueName: \"kubernetes.io/projected/f52a10b3-813e-439c-b3f6-1a5970cd5eec-kube-api-access-ljlt7\") pod \"barbican0662-account-delete-vsrf5\" (UID: \"f52a10b3-813e-439c-b3f6-1a5970cd5eec\") " pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.003606 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52a10b3-813e-439c-b3f6-1a5970cd5eec-operator-scripts\") pod \"barbican0662-account-delete-vsrf5\" (UID: 
\"f52a10b3-813e-439c-b3f6-1a5970cd5eec\") " pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.029871 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljlt7\" (UniqueName: \"kubernetes.io/projected/f52a10b3-813e-439c-b3f6-1a5970cd5eec-kube-api-access-ljlt7\") pod \"barbican0662-account-delete-vsrf5\" (UID: \"f52a10b3-813e-439c-b3f6-1a5970cd5eec\") " pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.054430 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9"] Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.064372 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-pskq9"] Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.069815 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8"] Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.074431 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-c8mk8"] Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.115417 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.377889 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef2690e-bd56-4516-8e34-24d5b6299a11" path="/var/lib/kubelet/pods/1ef2690e-bd56-4516-8e34-24d5b6299a11/volumes" Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.379060 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55cdaae2-d1fa-4630-a612-1f1418a0a640" path="/var/lib/kubelet/pods/55cdaae2-d1fa-4630-a612-1f1418a0a640/volumes" Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.379859 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a394bdbf-e06e-4ed6-b77f-93885589bded" path="/var/lib/kubelet/pods/a394bdbf-e06e-4ed6-b77f-93885589bded/volumes" Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.381198 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8c43a99-ce90-404f-bc48-191bd422fced" path="/var/lib/kubelet/pods/c8c43a99-ce90-404f-bc48-191bd422fced/volumes" Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.423945 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican0662-account-delete-vsrf5"] Feb 28 03:53:48 crc kubenswrapper[4819]: W0228 03:53:48.434713 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf52a10b3_813e_439c_b3f6_1a5970cd5eec.slice/crio-4796058132cc5024e7b3eeaf7cc7718551e561b71f523144331a3848004e19bf WatchSource:0}: Error finding container 4796058132cc5024e7b3eeaf7cc7718551e561b71f523144331a3848004e19bf: Status 404 returned error can't find the container with id 4796058132cc5024e7b3eeaf7cc7718551e561b71f523144331a3848004e19bf Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.754776 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" 
event={"ID":"f52a10b3-813e-439c-b3f6-1a5970cd5eec","Type":"ContainerStarted","Data":"3e331c43edcbe05ce304e52e88d7c8531de494cb84650dbf542ea8f661de5b21"} Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.754824 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" event={"ID":"f52a10b3-813e-439c-b3f6-1a5970cd5eec","Type":"ContainerStarted","Data":"4796058132cc5024e7b3eeaf7cc7718551e561b71f523144331a3848004e19bf"} Feb 28 03:53:48 crc kubenswrapper[4819]: I0228 03:53:48.786385 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" podStartSLOduration=1.7863607369999999 podStartE2EDuration="1.786360737s" podCreationTimestamp="2026-02-28 03:53:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:53:48.775570551 +0000 UTC m=+1167.241139439" watchObservedRunningTime="2026-02-28 03:53:48.786360737 +0000 UTC m=+1167.251929605" Feb 28 03:53:49 crc kubenswrapper[4819]: I0228 03:53:49.769380 4819 generic.go:334] "Generic (PLEG): container finished" podID="f52a10b3-813e-439c-b3f6-1a5970cd5eec" containerID="3e331c43edcbe05ce304e52e88d7c8531de494cb84650dbf542ea8f661de5b21" exitCode=0 Feb 28 03:53:49 crc kubenswrapper[4819]: I0228 03:53:49.769462 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" event={"ID":"f52a10b3-813e-439c-b3f6-1a5970cd5eec","Type":"ContainerDied","Data":"3e331c43edcbe05ce304e52e88d7c8531de494cb84650dbf542ea8f661de5b21"} Feb 28 03:53:50 crc kubenswrapper[4819]: I0228 03:53:50.151642 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:53:51 crc kubenswrapper[4819]: I0228 03:53:51.074460 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" Feb 28 03:53:51 crc kubenswrapper[4819]: I0228 03:53:51.253919 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljlt7\" (UniqueName: \"kubernetes.io/projected/f52a10b3-813e-439c-b3f6-1a5970cd5eec-kube-api-access-ljlt7\") pod \"f52a10b3-813e-439c-b3f6-1a5970cd5eec\" (UID: \"f52a10b3-813e-439c-b3f6-1a5970cd5eec\") " Feb 28 03:53:51 crc kubenswrapper[4819]: I0228 03:53:51.253993 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52a10b3-813e-439c-b3f6-1a5970cd5eec-operator-scripts\") pod \"f52a10b3-813e-439c-b3f6-1a5970cd5eec\" (UID: \"f52a10b3-813e-439c-b3f6-1a5970cd5eec\") " Feb 28 03:53:51 crc kubenswrapper[4819]: I0228 03:53:51.254997 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f52a10b3-813e-439c-b3f6-1a5970cd5eec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f52a10b3-813e-439c-b3f6-1a5970cd5eec" (UID: "f52a10b3-813e-439c-b3f6-1a5970cd5eec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:53:51 crc kubenswrapper[4819]: I0228 03:53:51.262986 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52a10b3-813e-439c-b3f6-1a5970cd5eec-kube-api-access-ljlt7" (OuterVolumeSpecName: "kube-api-access-ljlt7") pod "f52a10b3-813e-439c-b3f6-1a5970cd5eec" (UID: "f52a10b3-813e-439c-b3f6-1a5970cd5eec"). InnerVolumeSpecName "kube-api-access-ljlt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:53:51 crc kubenswrapper[4819]: I0228 03:53:51.356850 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljlt7\" (UniqueName: \"kubernetes.io/projected/f52a10b3-813e-439c-b3f6-1a5970cd5eec-kube-api-access-ljlt7\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:51 crc kubenswrapper[4819]: I0228 03:53:51.356911 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52a10b3-813e-439c-b3f6-1a5970cd5eec-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:51 crc kubenswrapper[4819]: I0228 03:53:51.791078 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" event={"ID":"f52a10b3-813e-439c-b3f6-1a5970cd5eec","Type":"ContainerDied","Data":"4796058132cc5024e7b3eeaf7cc7718551e561b71f523144331a3848004e19bf"} Feb 28 03:53:51 crc kubenswrapper[4819]: I0228 03:53:51.791133 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4796058132cc5024e7b3eeaf7cc7718551e561b71f523144331a3848004e19bf" Feb 28 03:53:51 crc kubenswrapper[4819]: I0228 03:53:51.791182 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican0662-account-delete-vsrf5" Feb 28 03:53:52 crc kubenswrapper[4819]: I0228 03:53:52.827852 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-zf659"] Feb 28 03:53:52 crc kubenswrapper[4819]: I0228 03:53:52.841791 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-zf659"] Feb 28 03:53:52 crc kubenswrapper[4819]: I0228 03:53:52.851358 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican0662-account-delete-vsrf5"] Feb 28 03:53:52 crc kubenswrapper[4819]: I0228 03:53:52.857480 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican0662-account-delete-vsrf5"] Feb 28 03:53:52 crc kubenswrapper[4819]: I0228 03:53:52.863062 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt"] Feb 28 03:53:52 crc kubenswrapper[4819]: I0228 03:53:52.868541 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-0662-account-create-update-zcwnt"] Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.026434 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-create-696cp"] Feb 28 03:53:53 crc kubenswrapper[4819]: E0228 03:53:53.028726 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52a10b3-813e-439c-b3f6-1a5970cd5eec" containerName="mariadb-account-delete" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.028758 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52a10b3-813e-439c-b3f6-1a5970cd5eec" containerName="mariadb-account-delete" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.028913 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52a10b3-813e-439c-b3f6-1a5970cd5eec" containerName="mariadb-account-delete" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 
03:53:53.029491 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-696cp" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.034238 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk"] Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.035402 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.037507 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-db-secret" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.042404 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-696cp"] Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.063821 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk"] Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.185749 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ccad3f-3977-49ba-93a9-157d2fe8aa84-operator-scripts\") pod \"barbican-db-create-696cp\" (UID: \"46ccad3f-3977-49ba-93a9-157d2fe8aa84\") " pod="barbican-kuttl-tests/barbican-db-create-696cp" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.185801 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5jz8\" (UniqueName: \"kubernetes.io/projected/46ccad3f-3977-49ba-93a9-157d2fe8aa84-kube-api-access-k5jz8\") pod \"barbican-db-create-696cp\" (UID: \"46ccad3f-3977-49ba-93a9-157d2fe8aa84\") " pod="barbican-kuttl-tests/barbican-db-create-696cp" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.185830 4819 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69910ece-3df8-46e9-96c0-329dc732e16e-operator-scripts\") pod \"barbican-e8ca-account-create-update-8gtjk\" (UID: \"69910ece-3df8-46e9-96c0-329dc732e16e\") " pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.185923 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5p6d\" (UniqueName: \"kubernetes.io/projected/69910ece-3df8-46e9-96c0-329dc732e16e-kube-api-access-q5p6d\") pod \"barbican-e8ca-account-create-update-8gtjk\" (UID: \"69910ece-3df8-46e9-96c0-329dc732e16e\") " pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.287788 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5p6d\" (UniqueName: \"kubernetes.io/projected/69910ece-3df8-46e9-96c0-329dc732e16e-kube-api-access-q5p6d\") pod \"barbican-e8ca-account-create-update-8gtjk\" (UID: \"69910ece-3df8-46e9-96c0-329dc732e16e\") " pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.287969 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ccad3f-3977-49ba-93a9-157d2fe8aa84-operator-scripts\") pod \"barbican-db-create-696cp\" (UID: \"46ccad3f-3977-49ba-93a9-157d2fe8aa84\") " pod="barbican-kuttl-tests/barbican-db-create-696cp" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.288014 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5jz8\" (UniqueName: \"kubernetes.io/projected/46ccad3f-3977-49ba-93a9-157d2fe8aa84-kube-api-access-k5jz8\") pod \"barbican-db-create-696cp\" (UID: 
\"46ccad3f-3977-49ba-93a9-157d2fe8aa84\") " pod="barbican-kuttl-tests/barbican-db-create-696cp" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.288047 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69910ece-3df8-46e9-96c0-329dc732e16e-operator-scripts\") pod \"barbican-e8ca-account-create-update-8gtjk\" (UID: \"69910ece-3df8-46e9-96c0-329dc732e16e\") " pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.289388 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69910ece-3df8-46e9-96c0-329dc732e16e-operator-scripts\") pod \"barbican-e8ca-account-create-update-8gtjk\" (UID: \"69910ece-3df8-46e9-96c0-329dc732e16e\") " pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.289406 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ccad3f-3977-49ba-93a9-157d2fe8aa84-operator-scripts\") pod \"barbican-db-create-696cp\" (UID: \"46ccad3f-3977-49ba-93a9-157d2fe8aa84\") " pod="barbican-kuttl-tests/barbican-db-create-696cp" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.311158 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5jz8\" (UniqueName: \"kubernetes.io/projected/46ccad3f-3977-49ba-93a9-157d2fe8aa84-kube-api-access-k5jz8\") pod \"barbican-db-create-696cp\" (UID: \"46ccad3f-3977-49ba-93a9-157d2fe8aa84\") " pod="barbican-kuttl-tests/barbican-db-create-696cp" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.317853 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5p6d\" (UniqueName: \"kubernetes.io/projected/69910ece-3df8-46e9-96c0-329dc732e16e-kube-api-access-q5p6d\") 
pod \"barbican-e8ca-account-create-update-8gtjk\" (UID: \"69910ece-3df8-46e9-96c0-329dc732e16e\") " pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.353465 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-696cp" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.359379 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.623878 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-696cp"] Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.660895 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk"] Feb 28 03:53:53 crc kubenswrapper[4819]: W0228 03:53:53.681429 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69910ece_3df8_46e9_96c0_329dc732e16e.slice/crio-e093dd018a61e1e7dee28f4dada00807205958bef7886c8eefeb06c8b0ae17e9 WatchSource:0}: Error finding container e093dd018a61e1e7dee28f4dada00807205958bef7886c8eefeb06c8b0ae17e9: Status 404 returned error can't find the container with id e093dd018a61e1e7dee28f4dada00807205958bef7886c8eefeb06c8b0ae17e9 Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.809015 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-696cp" event={"ID":"46ccad3f-3977-49ba-93a9-157d2fe8aa84","Type":"ContainerStarted","Data":"912cea66fd171c516b82815dad5f7764b22b822b13bb5b8c2195c790b259af1d"} Feb 28 03:53:53 crc kubenswrapper[4819]: I0228 03:53:53.810556 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" 
event={"ID":"69910ece-3df8-46e9-96c0-329dc732e16e","Type":"ContainerStarted","Data":"e093dd018a61e1e7dee28f4dada00807205958bef7886c8eefeb06c8b0ae17e9"} Feb 28 03:53:54 crc kubenswrapper[4819]: I0228 03:53:54.382813 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13e68315-57a9-416c-9c98-83e30d943815" path="/var/lib/kubelet/pods/13e68315-57a9-416c-9c98-83e30d943815/volumes" Feb 28 03:53:54 crc kubenswrapper[4819]: I0228 03:53:54.384310 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4427ba01-e5bc-4dc0-a501-47576384ba41" path="/var/lib/kubelet/pods/4427ba01-e5bc-4dc0-a501-47576384ba41/volumes" Feb 28 03:53:54 crc kubenswrapper[4819]: I0228 03:53:54.385375 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52a10b3-813e-439c-b3f6-1a5970cd5eec" path="/var/lib/kubelet/pods/f52a10b3-813e-439c-b3f6-1a5970cd5eec/volumes" Feb 28 03:53:54 crc kubenswrapper[4819]: I0228 03:53:54.553231 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:53:54 crc kubenswrapper[4819]: I0228 03:53:54.556972 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:53:54 crc kubenswrapper[4819]: I0228 03:53:54.821527 4819 generic.go:334] "Generic (PLEG): container finished" podID="69910ece-3df8-46e9-96c0-329dc732e16e" containerID="464d5583c42f5212a4c750a4e033617e75baf588a143582bfc7096ea9941af2e" exitCode=0 Feb 28 03:53:54 crc kubenswrapper[4819]: I0228 03:53:54.821595 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" 
event={"ID":"69910ece-3df8-46e9-96c0-329dc732e16e","Type":"ContainerDied","Data":"464d5583c42f5212a4c750a4e033617e75baf588a143582bfc7096ea9941af2e"} Feb 28 03:53:54 crc kubenswrapper[4819]: I0228 03:53:54.824706 4819 generic.go:334] "Generic (PLEG): container finished" podID="46ccad3f-3977-49ba-93a9-157d2fe8aa84" containerID="b616418b8ed847a1deb51307c4e891033ecfc2ecf69f98056fb96dc67e3b5de4" exitCode=0 Feb 28 03:53:54 crc kubenswrapper[4819]: I0228 03:53:54.824775 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-696cp" event={"ID":"46ccad3f-3977-49ba-93a9-157d2fe8aa84","Type":"ContainerDied","Data":"b616418b8ed847a1deb51307c4e891033ecfc2ecf69f98056fb96dc67e3b5de4"} Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.255754 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.259285 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-696cp" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.335685 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69910ece-3df8-46e9-96c0-329dc732e16e-operator-scripts\") pod \"69910ece-3df8-46e9-96c0-329dc732e16e\" (UID: \"69910ece-3df8-46e9-96c0-329dc732e16e\") " Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.335835 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ccad3f-3977-49ba-93a9-157d2fe8aa84-operator-scripts\") pod \"46ccad3f-3977-49ba-93a9-157d2fe8aa84\" (UID: \"46ccad3f-3977-49ba-93a9-157d2fe8aa84\") " Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.335872 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5jz8\" (UniqueName: \"kubernetes.io/projected/46ccad3f-3977-49ba-93a9-157d2fe8aa84-kube-api-access-k5jz8\") pod \"46ccad3f-3977-49ba-93a9-157d2fe8aa84\" (UID: \"46ccad3f-3977-49ba-93a9-157d2fe8aa84\") " Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.335970 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5p6d\" (UniqueName: \"kubernetes.io/projected/69910ece-3df8-46e9-96c0-329dc732e16e-kube-api-access-q5p6d\") pod \"69910ece-3df8-46e9-96c0-329dc732e16e\" (UID: \"69910ece-3df8-46e9-96c0-329dc732e16e\") " Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.337118 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69910ece-3df8-46e9-96c0-329dc732e16e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69910ece-3df8-46e9-96c0-329dc732e16e" (UID: "69910ece-3df8-46e9-96c0-329dc732e16e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.337181 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46ccad3f-3977-49ba-93a9-157d2fe8aa84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46ccad3f-3977-49ba-93a9-157d2fe8aa84" (UID: "46ccad3f-3977-49ba-93a9-157d2fe8aa84"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.344659 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ccad3f-3977-49ba-93a9-157d2fe8aa84-kube-api-access-k5jz8" (OuterVolumeSpecName: "kube-api-access-k5jz8") pod "46ccad3f-3977-49ba-93a9-157d2fe8aa84" (UID: "46ccad3f-3977-49ba-93a9-157d2fe8aa84"). InnerVolumeSpecName "kube-api-access-k5jz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.345023 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69910ece-3df8-46e9-96c0-329dc732e16e-kube-api-access-q5p6d" (OuterVolumeSpecName: "kube-api-access-q5p6d") pod "69910ece-3df8-46e9-96c0-329dc732e16e" (UID: "69910ece-3df8-46e9-96c0-329dc732e16e"). InnerVolumeSpecName "kube-api-access-q5p6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.438492 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ccad3f-3977-49ba-93a9-157d2fe8aa84-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.438537 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5jz8\" (UniqueName: \"kubernetes.io/projected/46ccad3f-3977-49ba-93a9-157d2fe8aa84-kube-api-access-k5jz8\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.438559 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5p6d\" (UniqueName: \"kubernetes.io/projected/69910ece-3df8-46e9-96c0-329dc732e16e-kube-api-access-q5p6d\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.438576 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69910ece-3df8-46e9-96c0-329dc732e16e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.848461 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-696cp" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.848456 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-696cp" event={"ID":"46ccad3f-3977-49ba-93a9-157d2fe8aa84","Type":"ContainerDied","Data":"912cea66fd171c516b82815dad5f7764b22b822b13bb5b8c2195c790b259af1d"} Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.849467 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="912cea66fd171c516b82815dad5f7764b22b822b13bb5b8c2195c790b259af1d" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.851035 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" event={"ID":"69910ece-3df8-46e9-96c0-329dc732e16e","Type":"ContainerDied","Data":"e093dd018a61e1e7dee28f4dada00807205958bef7886c8eefeb06c8b0ae17e9"} Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.851086 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e093dd018a61e1e7dee28f4dada00807205958bef7886c8eefeb06c8b0ae17e9" Feb 28 03:53:56 crc kubenswrapper[4819]: I0228 03:53:56.851139 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.317386 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-lxq6q"] Feb 28 03:53:58 crc kubenswrapper[4819]: E0228 03:53:58.317769 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69910ece-3df8-46e9-96c0-329dc732e16e" containerName="mariadb-account-create-update" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.317786 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="69910ece-3df8-46e9-96c0-329dc732e16e" containerName="mariadb-account-create-update" Feb 28 03:53:58 crc kubenswrapper[4819]: E0228 03:53:58.317811 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ccad3f-3977-49ba-93a9-157d2fe8aa84" containerName="mariadb-database-create" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.317820 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ccad3f-3977-49ba-93a9-157d2fe8aa84" containerName="mariadb-database-create" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.317986 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="46ccad3f-3977-49ba-93a9-157d2fe8aa84" containerName="mariadb-database-create" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.318003 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="69910ece-3df8-46e9-96c0-329dc732e16e" containerName="mariadb-account-create-update" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.318631 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.321551 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-lt68b" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.322193 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"combined-ca-bundle" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.324661 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-lxq6q"] Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.372845 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-db-sync-config-data\") pod \"barbican-db-sync-lxq6q\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.372889 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-combined-ca-bundle\") pod \"barbican-db-sync-lxq6q\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.372953 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xmcw\" (UniqueName: \"kubernetes.io/projected/f1354046-3b03-4797-b2cd-282709ec0186-kube-api-access-9xmcw\") pod \"barbican-db-sync-lxq6q\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.474180 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9xmcw\" (UniqueName: \"kubernetes.io/projected/f1354046-3b03-4797-b2cd-282709ec0186-kube-api-access-9xmcw\") pod \"barbican-db-sync-lxq6q\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.474535 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-db-sync-config-data\") pod \"barbican-db-sync-lxq6q\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.474635 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-combined-ca-bundle\") pod \"barbican-db-sync-lxq6q\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.479531 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-combined-ca-bundle\") pod \"barbican-db-sync-lxq6q\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.491128 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-db-sync-config-data\") pod \"barbican-db-sync-lxq6q\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.506657 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xmcw\" (UniqueName: 
\"kubernetes.io/projected/f1354046-3b03-4797-b2cd-282709ec0186-kube-api-access-9xmcw\") pod \"barbican-db-sync-lxq6q\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.669833 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:53:58 crc kubenswrapper[4819]: I0228 03:53:58.934842 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-lxq6q"] Feb 28 03:53:58 crc kubenswrapper[4819]: W0228 03:53:58.941007 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1354046_3b03_4797_b2cd_282709ec0186.slice/crio-5ea6f5ff3752f4af4e75a36aa5a9f502e57f751fdfc7b091021a613c25709d9d WatchSource:0}: Error finding container 5ea6f5ff3752f4af4e75a36aa5a9f502e57f751fdfc7b091021a613c25709d9d: Status 404 returned error can't find the container with id 5ea6f5ff3752f4af4e75a36aa5a9f502e57f751fdfc7b091021a613c25709d9d Feb 28 03:53:59 crc kubenswrapper[4819]: I0228 03:53:59.535735 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:53:59 crc kubenswrapper[4819]: I0228 03:53:59.542761 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:53:59 crc kubenswrapper[4819]: I0228 03:53:59.879517 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" 
event={"ID":"f1354046-3b03-4797-b2cd-282709ec0186","Type":"ContainerStarted","Data":"5ea6f5ff3752f4af4e75a36aa5a9f502e57f751fdfc7b091021a613c25709d9d"} Feb 28 03:54:00 crc kubenswrapper[4819]: I0228 03:54:00.128762 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537514-gx2vz"] Feb 28 03:54:00 crc kubenswrapper[4819]: I0228 03:54:00.129515 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537514-gx2vz" Feb 28 03:54:00 crc kubenswrapper[4819]: I0228 03:54:00.133483 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 03:54:00 crc kubenswrapper[4819]: I0228 03:54:00.133632 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 03:54:00 crc kubenswrapper[4819]: I0228 03:54:00.133712 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw" Feb 28 03:54:00 crc kubenswrapper[4819]: I0228 03:54:00.144162 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537514-gx2vz"] Feb 28 03:54:00 crc kubenswrapper[4819]: I0228 03:54:00.198119 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfrlh\" (UniqueName: \"kubernetes.io/projected/26b9d23e-ecf4-482b-ac4f-6c7585b30b54-kube-api-access-jfrlh\") pod \"auto-csr-approver-29537514-gx2vz\" (UID: \"26b9d23e-ecf4-482b-ac4f-6c7585b30b54\") " pod="openshift-infra/auto-csr-approver-29537514-gx2vz" Feb 28 03:54:01 crc kubenswrapper[4819]: I0228 03:54:00.299890 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfrlh\" (UniqueName: \"kubernetes.io/projected/26b9d23e-ecf4-482b-ac4f-6c7585b30b54-kube-api-access-jfrlh\") pod \"auto-csr-approver-29537514-gx2vz\" (UID: 
\"26b9d23e-ecf4-482b-ac4f-6c7585b30b54\") " pod="openshift-infra/auto-csr-approver-29537514-gx2vz" Feb 28 03:54:01 crc kubenswrapper[4819]: I0228 03:54:00.327366 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfrlh\" (UniqueName: \"kubernetes.io/projected/26b9d23e-ecf4-482b-ac4f-6c7585b30b54-kube-api-access-jfrlh\") pod \"auto-csr-approver-29537514-gx2vz\" (UID: \"26b9d23e-ecf4-482b-ac4f-6c7585b30b54\") " pod="openshift-infra/auto-csr-approver-29537514-gx2vz" Feb 28 03:54:01 crc kubenswrapper[4819]: I0228 03:54:00.450046 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537514-gx2vz" Feb 28 03:54:01 crc kubenswrapper[4819]: I0228 03:54:01.594358 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537514-gx2vz"] Feb 28 03:54:01 crc kubenswrapper[4819]: W0228 03:54:01.610575 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26b9d23e_ecf4_482b_ac4f_6c7585b30b54.slice/crio-176af7882337021a1a328d373d5aa494d5e2b88d9b55b86520a94cd0221a3793 WatchSource:0}: Error finding container 176af7882337021a1a328d373d5aa494d5e2b88d9b55b86520a94cd0221a3793: Status 404 returned error can't find the container with id 176af7882337021a1a328d373d5aa494d5e2b88d9b55b86520a94cd0221a3793 Feb 28 03:54:01 crc kubenswrapper[4819]: I0228 03:54:01.898503 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537514-gx2vz" event={"ID":"26b9d23e-ecf4-482b-ac4f-6c7585b30b54","Type":"ContainerStarted","Data":"176af7882337021a1a328d373d5aa494d5e2b88d9b55b86520a94cd0221a3793"} Feb 28 03:54:01 crc kubenswrapper[4819]: I0228 03:54:01.900896 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" 
event={"ID":"f1354046-3b03-4797-b2cd-282709ec0186","Type":"ContainerStarted","Data":"e817a969209f281910e85697a63def6e468220d0e416fafb41e938a5c42100da"} Feb 28 03:54:01 crc kubenswrapper[4819]: I0228 03:54:01.930607 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" podStartSLOduration=3.930567119 podStartE2EDuration="3.930567119s" podCreationTimestamp="2026-02-28 03:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:54:01.924170872 +0000 UTC m=+1180.389739780" watchObservedRunningTime="2026-02-28 03:54:01.930567119 +0000 UTC m=+1180.396136017" Feb 28 03:54:02 crc kubenswrapper[4819]: I0228 03:54:02.910058 4819 generic.go:334] "Generic (PLEG): container finished" podID="f1354046-3b03-4797-b2cd-282709ec0186" containerID="e817a969209f281910e85697a63def6e468220d0e416fafb41e938a5c42100da" exitCode=0 Feb 28 03:54:02 crc kubenswrapper[4819]: I0228 03:54:02.910275 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" event={"ID":"f1354046-3b03-4797-b2cd-282709ec0186","Type":"ContainerDied","Data":"e817a969209f281910e85697a63def6e468220d0e416fafb41e938a5c42100da"} Feb 28 03:54:03 crc kubenswrapper[4819]: I0228 03:54:03.920623 4819 generic.go:334] "Generic (PLEG): container finished" podID="26b9d23e-ecf4-482b-ac4f-6c7585b30b54" containerID="b886ff7654199bfd530b5f29bcd25710ace87e00b445783c0805ed96066e290a" exitCode=0 Feb 28 03:54:03 crc kubenswrapper[4819]: I0228 03:54:03.920727 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537514-gx2vz" event={"ID":"26b9d23e-ecf4-482b-ac4f-6c7585b30b54","Type":"ContainerDied","Data":"b886ff7654199bfd530b5f29bcd25710ace87e00b445783c0805ed96066e290a"} Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.278897 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.372770 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xmcw\" (UniqueName: \"kubernetes.io/projected/f1354046-3b03-4797-b2cd-282709ec0186-kube-api-access-9xmcw\") pod \"f1354046-3b03-4797-b2cd-282709ec0186\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.372811 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-db-sync-config-data\") pod \"f1354046-3b03-4797-b2cd-282709ec0186\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.372872 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-combined-ca-bundle\") pod \"f1354046-3b03-4797-b2cd-282709ec0186\" (UID: \"f1354046-3b03-4797-b2cd-282709ec0186\") " Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.374197 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.378050 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f1354046-3b03-4797-b2cd-282709ec0186" (UID: "f1354046-3b03-4797-b2cd-282709ec0186"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.378486 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1354046-3b03-4797-b2cd-282709ec0186-kube-api-access-9xmcw" (OuterVolumeSpecName: "kube-api-access-9xmcw") pod "f1354046-3b03-4797-b2cd-282709ec0186" (UID: "f1354046-3b03-4797-b2cd-282709ec0186"). InnerVolumeSpecName "kube-api-access-9xmcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.392704 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1354046-3b03-4797-b2cd-282709ec0186" (UID: "f1354046-3b03-4797-b2cd-282709ec0186"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.463881 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.474847 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xmcw\" (UniqueName: \"kubernetes.io/projected/f1354046-3b03-4797-b2cd-282709ec0186-kube-api-access-9xmcw\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.475101 4819 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.475121 4819 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f1354046-3b03-4797-b2cd-282709ec0186-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.931188 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.931216 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-lxq6q" event={"ID":"f1354046-3b03-4797-b2cd-282709ec0186","Type":"ContainerDied","Data":"5ea6f5ff3752f4af4e75a36aa5a9f502e57f751fdfc7b091021a613c25709d9d"} Feb 28 03:54:04 crc kubenswrapper[4819]: I0228 03:54:04.933356 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ea6f5ff3752f4af4e75a36aa5a9f502e57f751fdfc7b091021a613c25709d9d" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.299925 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537514-gx2vz" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.391008 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfrlh\" (UniqueName: \"kubernetes.io/projected/26b9d23e-ecf4-482b-ac4f-6c7585b30b54-kube-api-access-jfrlh\") pod \"26b9d23e-ecf4-482b-ac4f-6c7585b30b54\" (UID: \"26b9d23e-ecf4-482b-ac4f-6c7585b30b54\") " Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.395011 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26b9d23e-ecf4-482b-ac4f-6c7585b30b54-kube-api-access-jfrlh" (OuterVolumeSpecName: "kube-api-access-jfrlh") pod "26b9d23e-ecf4-482b-ac4f-6c7585b30b54" (UID: "26b9d23e-ecf4-482b-ac4f-6c7585b30b54"). InnerVolumeSpecName "kube-api-access-jfrlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.493220 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfrlh\" (UniqueName: \"kubernetes.io/projected/26b9d23e-ecf4-482b-ac4f-6c7585b30b54-kube-api-access-jfrlh\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.516333 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf"] Feb 28 03:54:05 crc kubenswrapper[4819]: E0228 03:54:05.516655 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b9d23e-ecf4-482b-ac4f-6c7585b30b54" containerName="oc" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.516672 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b9d23e-ecf4-482b-ac4f-6c7585b30b54" containerName="oc" Feb 28 03:54:05 crc kubenswrapper[4819]: E0228 03:54:05.516691 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1354046-3b03-4797-b2cd-282709ec0186" containerName="barbican-db-sync" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.516699 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1354046-3b03-4797-b2cd-282709ec0186" containerName="barbican-db-sync" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.516811 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1354046-3b03-4797-b2cd-282709ec0186" containerName="barbican-db-sync" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.516829 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b9d23e-ecf4-482b-ac4f-6c7585b30b54" containerName="oc" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.517553 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.520551 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"combined-ca-bundle" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.520806 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-lt68b" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.525903 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2"] Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.526966 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.536484 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2"] Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.543584 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf"] Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.594180 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g6tk\" (UniqueName: \"kubernetes.io/projected/78c61cde-2f52-4c63-a78f-683011ce9a51-kube-api-access-5g6tk\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.594233 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9184103b-f4bc-413e-adda-26d6e0da77a3-logs\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " 
pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.594273 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data-custom\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.594291 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c61cde-2f52-4c63-a78f-683011ce9a51-logs\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.594323 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq5kp\" (UniqueName: \"kubernetes.io/projected/9184103b-f4bc-413e-adda-26d6e0da77a3-kube-api-access-kq5kp\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.594365 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.594378 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data-custom\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.594478 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.594505 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-combined-ca-bundle\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.594539 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-combined-ca-bundle\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.606770 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq"] Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.607819 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.610214 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"cert-barbican-internal-svc" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.610872 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"cert-barbican-public-svc" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.621964 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq"] Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696160 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-public-tls-certs\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696212 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g6tk\" (UniqueName: \"kubernetes.io/projected/78c61cde-2f52-4c63-a78f-683011ce9a51-kube-api-access-5g6tk\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696234 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-internal-tls-certs\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696269 4819 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9184103b-f4bc-413e-adda-26d6e0da77a3-logs\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696299 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data-custom\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696318 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c61cde-2f52-4c63-a78f-683011ce9a51-logs\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696340 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47699c41-4940-4aec-9644-93dfea90094b-logs\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696362 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq5kp\" (UniqueName: \"kubernetes.io/projected/9184103b-f4bc-413e-adda-26d6e0da77a3-kube-api-access-kq5kp\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc 
kubenswrapper[4819]: I0228 03:54:05.696385 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696411 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-combined-ca-bundle\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696433 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696449 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data-custom\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696469 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " 
pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696486 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-combined-ca-bundle\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696502 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xm2r\" (UniqueName: \"kubernetes.io/projected/47699c41-4940-4aec-9644-93dfea90094b-kube-api-access-5xm2r\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696523 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data-custom\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696547 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-combined-ca-bundle\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.696967 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/78c61cde-2f52-4c63-a78f-683011ce9a51-logs\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.697701 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9184103b-f4bc-413e-adda-26d6e0da77a3-logs\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.703507 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-combined-ca-bundle\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.703655 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data-custom\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.703842 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data-custom\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.704109 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.704163 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.704464 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-combined-ca-bundle\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.716648 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq5kp\" (UniqueName: \"kubernetes.io/projected/9184103b-f4bc-413e-adda-26d6e0da77a3-kube-api-access-kq5kp\") pod \"barbican-keystone-listener-9485f4fb6-66mzf\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") " pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.717460 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g6tk\" (UniqueName: \"kubernetes.io/projected/78c61cde-2f52-4c63-a78f-683011ce9a51-kube-api-access-5g6tk\") pod \"barbican-worker-5b659b5b6c-x9mm2\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") " pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 
03:54:05.797586 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xm2r\" (UniqueName: \"kubernetes.io/projected/47699c41-4940-4aec-9644-93dfea90094b-kube-api-access-5xm2r\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.797634 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data-custom\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.797665 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-public-tls-certs\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.797687 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-internal-tls-certs\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.797720 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47699c41-4940-4aec-9644-93dfea90094b-logs\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc 
kubenswrapper[4819]: I0228 03:54:05.797751 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.797777 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-combined-ca-bundle\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.798857 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47699c41-4940-4aec-9644-93dfea90094b-logs\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.801122 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-combined-ca-bundle\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.801654 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-public-tls-certs\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 
03:54:05.802793 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-internal-tls-certs\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.803518 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data-custom\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.804454 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.814725 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xm2r\" (UniqueName: \"kubernetes.io/projected/47699c41-4940-4aec-9644-93dfea90094b-kube-api-access-5xm2r\") pod \"barbican-api-7bb8b6db58-6qnzq\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") " pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.836652 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.844389 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2"
Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.924132 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq"
Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.948348 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537514-gx2vz" event={"ID":"26b9d23e-ecf4-482b-ac4f-6c7585b30b54","Type":"ContainerDied","Data":"176af7882337021a1a328d373d5aa494d5e2b88d9b55b86520a94cd0221a3793"}
Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.948384 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="176af7882337021a1a328d373d5aa494d5e2b88d9b55b86520a94cd0221a3793"
Feb 28 03:54:05 crc kubenswrapper[4819]: I0228 03:54:05.948455 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537514-gx2vz"
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.297648 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2"]
Feb 28 03:54:06 crc kubenswrapper[4819]: W0228 03:54:06.311386 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78c61cde_2f52_4c63_a78f_683011ce9a51.slice/crio-82790330c95ec9359b8cd864485885182ad3a2686adc79178c827cf32b7e8df8 WatchSource:0}: Error finding container 82790330c95ec9359b8cd864485885182ad3a2686adc79178c827cf32b7e8df8: Status 404 returned error can't find the container with id 82790330c95ec9359b8cd864485885182ad3a2686adc79178c827cf32b7e8df8
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.408417 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537508-7zh58"]
Feb 28 03:54:06 crc kubenswrapper[4819]: W0228 03:54:06.415195 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9184103b_f4bc_413e_adda_26d6e0da77a3.slice/crio-32b365d18f35d1a9bdcf27d79df5bf1881a1e205825256f683a90ee8c974391e WatchSource:0}: Error finding container 32b365d18f35d1a9bdcf27d79df5bf1881a1e205825256f683a90ee8c974391e: Status 404 returned error can't find the container with id 32b365d18f35d1a9bdcf27d79df5bf1881a1e205825256f683a90ee8c974391e
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.416955 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf"]
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.426347 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537508-7zh58"]
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.438560 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-lxq6q"]
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.440998 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-lxq6q"]
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.474385 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq"]
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.498870 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf"]
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.520305 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2"]
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.532526 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"]
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.533609 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.544174 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq"]
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.558101 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"]
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.610944 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ljdm\" (UniqueName: \"kubernetes.io/projected/35ad70b3-4188-42eb-9314-c606f72b4534-kube-api-access-2ljdm\") pod \"barbicane8ca-account-delete-kl7km\" (UID: \"35ad70b3-4188-42eb-9314-c606f72b4534\") " pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.611406 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ad70b3-4188-42eb-9314-c606f72b4534-operator-scripts\") pod \"barbicane8ca-account-delete-kl7km\" (UID: \"35ad70b3-4188-42eb-9314-c606f72b4534\") " pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.712642 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ad70b3-4188-42eb-9314-c606f72b4534-operator-scripts\") pod \"barbicane8ca-account-delete-kl7km\" (UID: \"35ad70b3-4188-42eb-9314-c606f72b4534\") " pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.712694 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ljdm\" (UniqueName: \"kubernetes.io/projected/35ad70b3-4188-42eb-9314-c606f72b4534-kube-api-access-2ljdm\") pod \"barbicane8ca-account-delete-kl7km\" (UID: \"35ad70b3-4188-42eb-9314-c606f72b4534\") " pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.713615 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ad70b3-4188-42eb-9314-c606f72b4534-operator-scripts\") pod \"barbicane8ca-account-delete-kl7km\" (UID: \"35ad70b3-4188-42eb-9314-c606f72b4534\") " pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.736659 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ljdm\" (UniqueName: \"kubernetes.io/projected/35ad70b3-4188-42eb-9314-c606f72b4534-kube-api-access-2ljdm\") pod \"barbicane8ca-account-delete-kl7km\" (UID: \"35ad70b3-4188-42eb-9314-c606f72b4534\") " pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.870189 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.987358 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" event={"ID":"47699c41-4940-4aec-9644-93dfea90094b","Type":"ContainerStarted","Data":"f8ee5e7a6b2b916ff363d2593c6593e161dc32b17d9b1ea1984733a15f19ec89"}
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.989844 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" event={"ID":"47699c41-4940-4aec-9644-93dfea90094b","Type":"ContainerStarted","Data":"6adc800411b8ab1df635dea464b97e6db4b232e8cb5638bb5853e5275d1a6640"}
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.989969 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" event={"ID":"47699c41-4940-4aec-9644-93dfea90094b","Type":"ContainerStarted","Data":"cc38fcb915b2a84cfaef38583d88a0f899b77203c25ba4d3b76da3ffa0d1ccd0"}
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.990401 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api-log" containerID="cri-o://6adc800411b8ab1df635dea464b97e6db4b232e8cb5638bb5853e5275d1a6640" gracePeriod=30
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.990611 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq"
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.990720 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api" containerID="cri-o://f8ee5e7a6b2b916ff363d2593c6593e161dc32b17d9b1ea1984733a15f19ec89" gracePeriod=30
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.993670 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" event={"ID":"78c61cde-2f52-4c63-a78f-683011ce9a51","Type":"ContainerStarted","Data":"8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9"}
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.993707 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" event={"ID":"78c61cde-2f52-4c63-a78f-683011ce9a51","Type":"ContainerStarted","Data":"49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274"}
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.993733 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" event={"ID":"78c61cde-2f52-4c63-a78f-683011ce9a51","Type":"ContainerStarted","Data":"82790330c95ec9359b8cd864485885182ad3a2686adc79178c827cf32b7e8df8"}
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.993810 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" podUID="78c61cde-2f52-4c63-a78f-683011ce9a51" containerName="barbican-worker-log" containerID="cri-o://49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274" gracePeriod=30
Feb 28 03:54:06 crc kubenswrapper[4819]: I0228 03:54:06.993859 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" podUID="78c61cde-2f52-4c63-a78f-683011ce9a51" containerName="barbican-worker" containerID="cri-o://8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9" gracePeriod=30
Feb 28 03:54:07 crc kubenswrapper[4819]: I0228 03:54:07.002066 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" event={"ID":"9184103b-f4bc-413e-adda-26d6e0da77a3","Type":"ContainerStarted","Data":"e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397"}
Feb 28 03:54:07 crc kubenswrapper[4819]: I0228 03:54:07.002110 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" event={"ID":"9184103b-f4bc-413e-adda-26d6e0da77a3","Type":"ContainerStarted","Data":"3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e"}
Feb 28 03:54:07 crc kubenswrapper[4819]: I0228 03:54:07.002121 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" event={"ID":"9184103b-f4bc-413e-adda-26d6e0da77a3","Type":"ContainerStarted","Data":"32b365d18f35d1a9bdcf27d79df5bf1881a1e205825256f683a90ee8c974391e"}
Feb 28 03:54:07 crc kubenswrapper[4819]: I0228 03:54:07.002230 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" podUID="9184103b-f4bc-413e-adda-26d6e0da77a3" containerName="barbican-keystone-listener-log" containerID="cri-o://3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e" gracePeriod=30
Feb 28 03:54:07 crc kubenswrapper[4819]: I0228 03:54:07.002330 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" podUID="9184103b-f4bc-413e-adda-26d6e0da77a3" containerName="barbican-keystone-listener" containerID="cri-o://e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397" gracePeriod=30
Feb 28 03:54:07 crc kubenswrapper[4819]: I0228 03:54:07.018627 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podStartSLOduration=2.018612276 podStartE2EDuration="2.018612276s" podCreationTimestamp="2026-02-28 03:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:54:07.017760346 +0000 UTC m=+1185.483329204" watchObservedRunningTime="2026-02-28 03:54:07.018612276 +0000 UTC m=+1185.484181124"
Feb 28 03:54:07 crc kubenswrapper[4819]: I0228 03:54:07.065502 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" podStartSLOduration=2.06548216 podStartE2EDuration="2.06548216s" podCreationTimestamp="2026-02-28 03:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:54:07.037600544 +0000 UTC m=+1185.503169402" watchObservedRunningTime="2026-02-28 03:54:07.06548216 +0000 UTC m=+1185.531051018"
Feb 28 03:54:07 crc kubenswrapper[4819]: I0228 03:54:07.067967 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" podStartSLOduration=2.067961451 podStartE2EDuration="2.067961451s" podCreationTimestamp="2026-02-28 03:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:54:07.058767905 +0000 UTC m=+1185.524336763" watchObservedRunningTime="2026-02-28 03:54:07.067961451 +0000 UTC m=+1185.533530309"
Feb 28 03:54:07 crc kubenswrapper[4819]: W0228 03:54:07.326580 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35ad70b3_4188_42eb_9314_c606f72b4534.slice/crio-3083cd7abdfeef78c1a068bec96d08b40bf17c3884d82e9c631593853d37c5fa WatchSource:0}: Error finding container 3083cd7abdfeef78c1a068bec96d08b40bf17c3884d82e9c631593853d37c5fa: Status 404 returned error can't find the container with id 3083cd7abdfeef78c1a068bec96d08b40bf17c3884d82e9c631593853d37c5fa
Feb 28 03:54:07 crc kubenswrapper[4819]: I0228 03:54:07.334980 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"]
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.009828 4819 generic.go:334] "Generic (PLEG): container finished" podID="9184103b-f4bc-413e-adda-26d6e0da77a3" containerID="3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e" exitCode=143
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.009846 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" event={"ID":"9184103b-f4bc-413e-adda-26d6e0da77a3","Type":"ContainerDied","Data":"3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e"}
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.012466 4819 generic.go:334] "Generic (PLEG): container finished" podID="47699c41-4940-4aec-9644-93dfea90094b" containerID="6adc800411b8ab1df635dea464b97e6db4b232e8cb5638bb5853e5275d1a6640" exitCode=143
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.012504 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" event={"ID":"47699c41-4940-4aec-9644-93dfea90094b","Type":"ContainerDied","Data":"6adc800411b8ab1df635dea464b97e6db4b232e8cb5638bb5853e5275d1a6640"}
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.013958 4819 generic.go:334] "Generic (PLEG): container finished" podID="78c61cde-2f52-4c63-a78f-683011ce9a51" containerID="49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274" exitCode=143
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.013995 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" event={"ID":"78c61cde-2f52-4c63-a78f-683011ce9a51","Type":"ContainerDied","Data":"49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274"}
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.014997 4819 generic.go:334] "Generic (PLEG): container finished" podID="35ad70b3-4188-42eb-9314-c606f72b4534" containerID="45670fb02aacf1f22e85015f74089337d3b67b60a37dfd9dc68b4e9d9beba427" exitCode=0
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.015019 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km" event={"ID":"35ad70b3-4188-42eb-9314-c606f72b4534","Type":"ContainerDied","Data":"45670fb02aacf1f22e85015f74089337d3b67b60a37dfd9dc68b4e9d9beba427"}
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.015032 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km" event={"ID":"35ad70b3-4188-42eb-9314-c606f72b4534","Type":"ContainerStarted","Data":"3083cd7abdfeef78c1a068bec96d08b40bf17c3884d82e9c631593853d37c5fa"}
Feb 28 03:54:08 crc kubenswrapper[4819]: E0228 03:54:08.259072 4819 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78c61cde_2f52_4c63_a78f_683011ce9a51.slice/crio-conmon-8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9.scope\": RecentStats: unable to find data in memory cache]"
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.379956 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68cab466-bfa3-441b-8c96-f1e3d4b3b4f9" path="/var/lib/kubelet/pods/68cab466-bfa3-441b-8c96-f1e3d4b3b4f9/volumes"
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.381558 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1354046-3b03-4797-b2cd-282709ec0186" path="/var/lib/kubelet/pods/f1354046-3b03-4797-b2cd-282709ec0186/volumes"
Feb 28 03:54:08 crc kubenswrapper[4819]: I0228 03:54:08.599870 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.605861 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.642501 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data-custom\") pod \"78c61cde-2f52-4c63-a78f-683011ce9a51\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.642563 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data\") pod \"9184103b-f4bc-413e-adda-26d6e0da77a3\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.642585 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data\") pod \"78c61cde-2f52-4c63-a78f-683011ce9a51\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.642627 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9184103b-f4bc-413e-adda-26d6e0da77a3-logs\") pod \"9184103b-f4bc-413e-adda-26d6e0da77a3\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.642656 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-combined-ca-bundle\") pod \"78c61cde-2f52-4c63-a78f-683011ce9a51\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.642682 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data-custom\") pod \"9184103b-f4bc-413e-adda-26d6e0da77a3\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.642716 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g6tk\" (UniqueName: \"kubernetes.io/projected/78c61cde-2f52-4c63-a78f-683011ce9a51-kube-api-access-5g6tk\") pod \"78c61cde-2f52-4c63-a78f-683011ce9a51\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.642740 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c61cde-2f52-4c63-a78f-683011ce9a51-logs\") pod \"78c61cde-2f52-4c63-a78f-683011ce9a51\" (UID: \"78c61cde-2f52-4c63-a78f-683011ce9a51\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.642769 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-combined-ca-bundle\") pod \"9184103b-f4bc-413e-adda-26d6e0da77a3\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.642830 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq5kp\" (UniqueName: \"kubernetes.io/projected/9184103b-f4bc-413e-adda-26d6e0da77a3-kube-api-access-kq5kp\") pod \"9184103b-f4bc-413e-adda-26d6e0da77a3\" (UID: \"9184103b-f4bc-413e-adda-26d6e0da77a3\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.645889 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9184103b-f4bc-413e-adda-26d6e0da77a3-logs" (OuterVolumeSpecName: "logs") pod "9184103b-f4bc-413e-adda-26d6e0da77a3" (UID: "9184103b-f4bc-413e-adda-26d6e0da77a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.646127 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c61cde-2f52-4c63-a78f-683011ce9a51-logs" (OuterVolumeSpecName: "logs") pod "78c61cde-2f52-4c63-a78f-683011ce9a51" (UID: "78c61cde-2f52-4c63-a78f-683011ce9a51"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.653454 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "78c61cde-2f52-4c63-a78f-683011ce9a51" (UID: "78c61cde-2f52-4c63-a78f-683011ce9a51"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.655500 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9184103b-f4bc-413e-adda-26d6e0da77a3-kube-api-access-kq5kp" (OuterVolumeSpecName: "kube-api-access-kq5kp") pod "9184103b-f4bc-413e-adda-26d6e0da77a3" (UID: "9184103b-f4bc-413e-adda-26d6e0da77a3"). InnerVolumeSpecName "kube-api-access-kq5kp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.669405 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c61cde-2f52-4c63-a78f-683011ce9a51-kube-api-access-5g6tk" (OuterVolumeSpecName: "kube-api-access-5g6tk") pod "78c61cde-2f52-4c63-a78f-683011ce9a51" (UID: "78c61cde-2f52-4c63-a78f-683011ce9a51"). InnerVolumeSpecName "kube-api-access-5g6tk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.679523 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9184103b-f4bc-413e-adda-26d6e0da77a3" (UID: "9184103b-f4bc-413e-adda-26d6e0da77a3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.688337 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9184103b-f4bc-413e-adda-26d6e0da77a3" (UID: "9184103b-f4bc-413e-adda-26d6e0da77a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.694221 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78c61cde-2f52-4c63-a78f-683011ce9a51" (UID: "78c61cde-2f52-4c63-a78f-683011ce9a51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.702200 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data" (OuterVolumeSpecName: "config-data") pod "9184103b-f4bc-413e-adda-26d6e0da77a3" (UID: "9184103b-f4bc-413e-adda-26d6e0da77a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.704413 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data" (OuterVolumeSpecName: "config-data") pod "78c61cde-2f52-4c63-a78f-683011ce9a51" (UID: "78c61cde-2f52-4c63-a78f-683011ce9a51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.744274 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.744300 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-config-data\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.744308 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.744318 4819 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9184103b-f4bc-413e-adda-26d6e0da77a3-logs\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.744327 4819 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c61cde-2f52-4c63-a78f-683011ce9a51-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.744334 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.744343 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g6tk\" (UniqueName: \"kubernetes.io/projected/78c61cde-2f52-4c63-a78f-683011ce9a51-kube-api-access-5g6tk\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.744352 4819 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c61cde-2f52-4c63-a78f-683011ce9a51-logs\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.744360 4819 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9184103b-f4bc-413e-adda-26d6e0da77a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:08.744369 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq5kp\" (UniqueName: \"kubernetes.io/projected/9184103b-f4bc-413e-adda-26d6e0da77a3-kube-api-access-kq5kp\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.021739 4819 generic.go:334] "Generic (PLEG): container finished" podID="9184103b-f4bc-413e-adda-26d6e0da77a3" containerID="e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397" exitCode=1
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.021793 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" event={"ID":"9184103b-f4bc-413e-adda-26d6e0da77a3","Type":"ContainerDied","Data":"e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397"}
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.021822 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf" event={"ID":"9184103b-f4bc-413e-adda-26d6e0da77a3","Type":"ContainerDied","Data":"32b365d18f35d1a9bdcf27d79df5bf1881a1e205825256f683a90ee8c974391e"}
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.021839 4819 scope.go:117] "RemoveContainer" containerID="e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.021947 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.030489 4819 generic.go:334] "Generic (PLEG): container finished" podID="78c61cde-2f52-4c63-a78f-683011ce9a51" containerID="8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9" exitCode=1
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.030660 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.033239 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" event={"ID":"78c61cde-2f52-4c63-a78f-683011ce9a51","Type":"ContainerDied","Data":"8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9"}
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.033298 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2" event={"ID":"78c61cde-2f52-4c63-a78f-683011ce9a51","Type":"ContainerDied","Data":"82790330c95ec9359b8cd864485885182ad3a2686adc79178c827cf32b7e8df8"}
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.059916 4819 scope.go:117] "RemoveContainer" containerID="3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.076622 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf"]
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.076658 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-9485f4fb6-66mzf"]
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.078647 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2"]
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.082651 4819 scope.go:117] "RemoveContainer" containerID="e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397"
Feb 28 03:54:09 crc kubenswrapper[4819]: E0228 03:54:09.083158 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397\": container with ID starting with e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397 not found: ID does not exist" containerID="e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.083183 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397"} err="failed to get container status \"e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397\": rpc error: code = NotFound desc = could not find container \"e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397\": container with ID starting with e4b970e571e394b0853b6e8a0ef31a3d7d4c787cf4b8dfc381d7c248ebac4397 not found: ID does not exist"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.083202 4819 scope.go:117] "RemoveContainer" containerID="3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e"
Feb 28 03:54:09 crc kubenswrapper[4819]: E0228 03:54:09.083682 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e\": container with ID starting with 3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e not found: ID does not exist" containerID="3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.083742 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e"} err="failed to get container status \"3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e\": rpc error: code = NotFound desc = could not find container \"3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e\": container with ID starting with 3f7dc2818afb1a1e03580aab74390c3e6373375963ed128f6f2831acae58a81e not found: ID does not exist"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.083784 4819 scope.go:117] "RemoveContainer" containerID="8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.102877 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-5b659b5b6c-x9mm2"]
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.106273 4819 scope.go:117] "RemoveContainer" containerID="49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.131936 4819 scope.go:117] "RemoveContainer" containerID="8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9"
Feb 28 03:54:09 crc kubenswrapper[4819]: E0228 03:54:09.132609 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9\": container with ID starting with 8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9 not found: ID does not exist" containerID="8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.132676 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9"} err="failed to get container status \"8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9\": rpc error: code = NotFound desc = could not find container \"8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9\": container with ID starting with 8317f020ca51da0c5f67c15dfb2d9cd77fbc7b5566988973d6b51bfe6f4553a9 not found: ID does not exist"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.132713 4819 scope.go:117] "RemoveContainer" containerID="49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274"
Feb 28 03:54:09 crc kubenswrapper[4819]: E0228 03:54:09.133151 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274\": container with ID starting with 49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274 not found: ID does not exist" containerID="49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.133184 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274"} err="failed to get container status \"49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274\": rpc error: code = NotFound desc = could not find container \"49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274\": container with ID starting with 49f2c82caa3fd176d9ca4055a724855dd128fa80f8441ea28a2364200826b274 not found: ID does not exist"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.347643 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.387099 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.430457 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.454028 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ad70b3-4188-42eb-9314-c606f72b4534-operator-scripts\") pod \"35ad70b3-4188-42eb-9314-c606f72b4534\" (UID: \"35ad70b3-4188-42eb-9314-c606f72b4534\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.454105 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ljdm\" (UniqueName: \"kubernetes.io/projected/35ad70b3-4188-42eb-9314-c606f72b4534-kube-api-access-2ljdm\") pod \"35ad70b3-4188-42eb-9314-c606f72b4534\" (UID: \"35ad70b3-4188-42eb-9314-c606f72b4534\") "
Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.455009 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35ad70b3-4188-42eb-9314-c606f72b4534-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35ad70b3-4188-42eb-9314-c606f72b4534" (UID: "35ad70b3-4188-42eb-9314-c606f72b4534"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.458634 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ad70b3-4188-42eb-9314-c606f72b4534-kube-api-access-2ljdm" (OuterVolumeSpecName: "kube-api-access-2ljdm") pod "35ad70b3-4188-42eb-9314-c606f72b4534" (UID: "35ad70b3-4188-42eb-9314-c606f72b4534"). InnerVolumeSpecName "kube-api-access-2ljdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.556711 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ljdm\" (UniqueName: \"kubernetes.io/projected/35ad70b3-4188-42eb-9314-c606f72b4534-kube-api-access-2ljdm\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:09 crc kubenswrapper[4819]: I0228 03:54:09.556752 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35ad70b3-4188-42eb-9314-c606f72b4534-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:10 crc kubenswrapper[4819]: I0228 03:54:10.043963 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km" Feb 28 03:54:10 crc kubenswrapper[4819]: I0228 03:54:10.043961 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicane8ca-account-delete-kl7km" event={"ID":"35ad70b3-4188-42eb-9314-c606f72b4534","Type":"ContainerDied","Data":"3083cd7abdfeef78c1a068bec96d08b40bf17c3884d82e9c631593853d37c5fa"} Feb 28 03:54:10 crc kubenswrapper[4819]: I0228 03:54:10.044032 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3083cd7abdfeef78c1a068bec96d08b40bf17c3884d82e9c631593853d37c5fa" Feb 28 03:54:10 crc kubenswrapper[4819]: I0228 03:54:10.384105 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c61cde-2f52-4c63-a78f-683011ce9a51" path="/var/lib/kubelet/pods/78c61cde-2f52-4c63-a78f-683011ce9a51/volumes" Feb 28 03:54:10 crc kubenswrapper[4819]: I0228 03:54:10.384767 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9184103b-f4bc-413e-adda-26d6e0da77a3" path="/var/lib/kubelet/pods/9184103b-f4bc-413e-adda-26d6e0da77a3/volumes" Feb 28 03:54:10 crc kubenswrapper[4819]: I0228 03:54:10.925162 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" Feb 28 03:54:11 crc kubenswrapper[4819]: I0228 03:54:11.540345 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-696cp"] Feb 28 03:54:11 crc kubenswrapper[4819]: I0228 03:54:11.551454 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-696cp"] Feb 28 03:54:11 crc kubenswrapper[4819]: I0228 03:54:11.561628 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk"] Feb 28 03:54:11 crc kubenswrapper[4819]: I0228 03:54:11.570098 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"] Feb 28 03:54:11 crc kubenswrapper[4819]: I0228 03:54:11.577855 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-e8ca-account-create-update-8gtjk"] Feb 28 03:54:11 crc kubenswrapper[4819]: I0228 03:54:11.584884 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbicane8ca-account-delete-kl7km"] Feb 28 03:54:12 crc kubenswrapper[4819]: I0228 03:54:12.382015 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ad70b3-4188-42eb-9314-c606f72b4534" path="/var/lib/kubelet/pods/35ad70b3-4188-42eb-9314-c606f72b4534/volumes" Feb 28 03:54:12 crc kubenswrapper[4819]: I0228 03:54:12.383422 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46ccad3f-3977-49ba-93a9-157d2fe8aa84" path="/var/lib/kubelet/pods/46ccad3f-3977-49ba-93a9-157d2fe8aa84/volumes" Feb 28 03:54:12 crc kubenswrapper[4819]: I0228 03:54:12.384327 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69910ece-3df8-46e9-96c0-329dc732e16e" path="/var/lib/kubelet/pods/69910ece-3df8-46e9-96c0-329dc732e16e/volumes" Feb 28 03:54:13 crc kubenswrapper[4819]: I0228 03:54:13.320100 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:54:13 crc kubenswrapper[4819]: I0228 03:54:13.326827 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.089605 4819 generic.go:334] "Generic (PLEG): container finished" podID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" 
containerID="85953e08ebfdef4466a3c04065f20174d6747c3e50fd306058993b49070a4421" exitCode=137 Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.089678 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" event={"ID":"08a26f1d-d188-41ce-8a4f-bc52a2d8492f","Type":"ContainerDied","Data":"85953e08ebfdef4466a3c04065f20174d6747c3e50fd306058993b49070a4421"} Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.155568 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.256092 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data\") pod \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.256149 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-logs\") pod \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.256227 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnnpg\" (UniqueName: \"kubernetes.io/projected/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-kube-api-access-lnnpg\") pod \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\" (UID: \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.256312 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data-custom\") pod \"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\" (UID: 
\"08a26f1d-d188-41ce-8a4f-bc52a2d8492f\") " Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.256957 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-logs" (OuterVolumeSpecName: "logs") pod "08a26f1d-d188-41ce-8a4f-bc52a2d8492f" (UID: "08a26f1d-d188-41ce-8a4f-bc52a2d8492f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.263969 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-kube-api-access-lnnpg" (OuterVolumeSpecName: "kube-api-access-lnnpg") pod "08a26f1d-d188-41ce-8a4f-bc52a2d8492f" (UID: "08a26f1d-d188-41ce-8a4f-bc52a2d8492f"). InnerVolumeSpecName "kube-api-access-lnnpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.264781 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08a26f1d-d188-41ce-8a4f-bc52a2d8492f" (UID: "08a26f1d-d188-41ce-8a4f-bc52a2d8492f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.304136 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data" (OuterVolumeSpecName: "config-data") pod "08a26f1d-d188-41ce-8a4f-bc52a2d8492f" (UID: "08a26f1d-d188-41ce-8a4f-bc52a2d8492f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.357635 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.357674 4819 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-logs\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.357688 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:15 crc kubenswrapper[4819]: I0228 03:54:15.357700 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnnpg\" (UniqueName: \"kubernetes.io/projected/08a26f1d-d188-41ce-8a4f-bc52a2d8492f-kube-api-access-lnnpg\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.103674 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.103679 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz" event={"ID":"08a26f1d-d188-41ce-8a4f-bc52a2d8492f","Type":"ContainerDied","Data":"ae0b2568c716fda617f4a6d45e6a3a37c45c72246f37c450b591cf755d6f8a13"} Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.103843 4819 scope.go:117] "RemoveContainer" containerID="85953e08ebfdef4466a3c04065f20174d6747c3e50fd306058993b49070a4421" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.106206 4819 generic.go:334] "Generic (PLEG): container finished" podID="398688ee-ba77-4410-8294-42eaffb91650" containerID="389b6726bf5f90963538c33367844b38e39c3836d02eed00c327a21d4aeb46d8" exitCode=137 Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.106276 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" event={"ID":"398688ee-ba77-4410-8294-42eaffb91650","Type":"ContainerDied","Data":"389b6726bf5f90963538c33367844b38e39c3836d02eed00c327a21d4aeb46d8"} Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.106323 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" event={"ID":"398688ee-ba77-4410-8294-42eaffb91650","Type":"ContainerDied","Data":"8d9a94176a82fab753625a4eaa119da38625c376a63e494777974f1db1cf05e9"} Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.106338 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d9a94176a82fab753625a4eaa119da38625c376a63e494777974f1db1cf05e9" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.118835 4819 generic.go:334] "Generic (PLEG): container finished" podID="e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" containerID="5d3a04cdef83005e050d26de1a862b569d03dfe3a5039255a27a4917f1cb7f7f" exitCode=137 Feb 
28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.118883 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" event={"ID":"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c","Type":"ContainerDied","Data":"5d3a04cdef83005e050d26de1a862b569d03dfe3a5039255a27a4917f1cb7f7f"} Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.126703 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.139857 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.144837 4819 scope.go:117] "RemoveContainer" containerID="a5059f98626fd4d049ccd48c8fc212c6336a4c7f426f9ba0951eef14c493f5ba" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.145346 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz"] Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.152481 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-5654c9f87b-vgddz"] Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.275836 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-logs\") pod \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.275886 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8w2j\" (UniqueName: \"kubernetes.io/projected/398688ee-ba77-4410-8294-42eaffb91650-kube-api-access-b8w2j\") pod \"398688ee-ba77-4410-8294-42eaffb91650\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " Feb 28 03:54:16 crc 
kubenswrapper[4819]: I0228 03:54:16.275927 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxs9b\" (UniqueName: \"kubernetes.io/projected/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-kube-api-access-cxs9b\") pod \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.275945 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data-custom\") pod \"398688ee-ba77-4410-8294-42eaffb91650\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.275970 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data\") pod \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.276024 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data\") pod \"398688ee-ba77-4410-8294-42eaffb91650\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.276040 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data-custom\") pod \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\" (UID: \"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c\") " Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.276094 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398688ee-ba77-4410-8294-42eaffb91650-logs\") pod 
\"398688ee-ba77-4410-8294-42eaffb91650\" (UID: \"398688ee-ba77-4410-8294-42eaffb91650\") " Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.276627 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398688ee-ba77-4410-8294-42eaffb91650-logs" (OuterVolumeSpecName: "logs") pod "398688ee-ba77-4410-8294-42eaffb91650" (UID: "398688ee-ba77-4410-8294-42eaffb91650"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.276811 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-logs" (OuterVolumeSpecName: "logs") pod "e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" (UID: "e49c04aa-14d4-4d76-ae1f-73c8aa193b2c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.280201 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" (UID: "e49c04aa-14d4-4d76-ae1f-73c8aa193b2c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.281746 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398688ee-ba77-4410-8294-42eaffb91650-kube-api-access-b8w2j" (OuterVolumeSpecName: "kube-api-access-b8w2j") pod "398688ee-ba77-4410-8294-42eaffb91650" (UID: "398688ee-ba77-4410-8294-42eaffb91650"). InnerVolumeSpecName "kube-api-access-b8w2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.293394 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-kube-api-access-cxs9b" (OuterVolumeSpecName: "kube-api-access-cxs9b") pod "e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" (UID: "e49c04aa-14d4-4d76-ae1f-73c8aa193b2c"). InnerVolumeSpecName "kube-api-access-cxs9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.299041 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "398688ee-ba77-4410-8294-42eaffb91650" (UID: "398688ee-ba77-4410-8294-42eaffb91650"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.306081 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data" (OuterVolumeSpecName: "config-data") pod "e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" (UID: "e49c04aa-14d4-4d76-ae1f-73c8aa193b2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.321598 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data" (OuterVolumeSpecName: "config-data") pod "398688ee-ba77-4410-8294-42eaffb91650" (UID: "398688ee-ba77-4410-8294-42eaffb91650"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.378618 4819 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-logs\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.378858 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8w2j\" (UniqueName: \"kubernetes.io/projected/398688ee-ba77-4410-8294-42eaffb91650-kube-api-access-b8w2j\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.378923 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxs9b\" (UniqueName: \"kubernetes.io/projected/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-kube-api-access-cxs9b\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.378982 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.379040 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.379132 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398688ee-ba77-4410-8294-42eaffb91650-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.379192 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.379269 4819 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398688ee-ba77-4410-8294-42eaffb91650-logs\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:16 crc kubenswrapper[4819]: I0228 03:54:16.383661 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" path="/var/lib/kubelet/pods/08a26f1d-d188-41ce-8a4f-bc52a2d8492f/volumes" Feb 28 03:54:17 crc kubenswrapper[4819]: I0228 03:54:17.132102 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" Feb 28 03:54:17 crc kubenswrapper[4819]: I0228 03:54:17.132162 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l" event={"ID":"e49c04aa-14d4-4d76-ae1f-73c8aa193b2c","Type":"ContainerDied","Data":"9015c2cbeefa07ac93b4e9e9324e3c9ca23925ee9207d178a21161ae0865312e"} Feb 28 03:54:17 crc kubenswrapper[4819]: I0228 03:54:17.132299 4819 scope.go:117] "RemoveContainer" containerID="5d3a04cdef83005e050d26de1a862b569d03dfe3a5039255a27a4917f1cb7f7f" Feb 28 03:54:17 crc kubenswrapper[4819]: I0228 03:54:17.134558 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg" Feb 28 03:54:17 crc kubenswrapper[4819]: I0228 03:54:17.166426 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:54:17 crc kubenswrapper[4819]: I0228 03:54:17.166646 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 28 03:54:17 crc kubenswrapper[4819]: I0228 03:54:17.174560 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg"] Feb 28 03:54:17 crc kubenswrapper[4819]: I0228 03:54:17.182017 4819 scope.go:117] "RemoveContainer" containerID="ef2e85c4e5ffa8da0a9736d2baedadfe364b23842f758ba61c75bad084772775" Feb 28 03:54:17 crc kubenswrapper[4819]: I0228 03:54:17.201334 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-5bdd47f94d-df2rg"] Feb 28 03:54:17 crc kubenswrapper[4819]: I0228 03:54:17.212987 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"] Feb 28 03:54:17 crc kubenswrapper[4819]: I0228 03:54:17.223191 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-64c846bbd9-7pv7l"] Feb 28 03:54:18 crc kubenswrapper[4819]: I0228 03:54:18.391887 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398688ee-ba77-4410-8294-42eaffb91650" path="/var/lib/kubelet/pods/398688ee-ba77-4410-8294-42eaffb91650/volumes" Feb 28 03:54:18 crc kubenswrapper[4819]: I0228 03:54:18.393406 4819 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" path="/var/lib/kubelet/pods/e49c04aa-14d4-4d76-ae1f-73c8aa193b2c/volumes"
Feb 28 03:54:22 crc kubenswrapper[4819]: I0228 03:54:22.139959 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:54:22 crc kubenswrapper[4819]: I0228 03:54:22.306137 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:54:27 crc kubenswrapper[4819]: I0228 03:54:27.116708 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:54:27 crc kubenswrapper[4819]: I0228 03:54:27.141587 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:54:32 crc kubenswrapper[4819]: I0228 03:54:32.104779 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:54:32 crc kubenswrapper[4819]: I0228 03:54:32.244741 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:54:36 crc kubenswrapper[4819]: I0228 03:54:36.022504 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:54:36 crc kubenswrapper[4819]: I0228 03:54:36.030589 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.330278 4819 generic.go:334] "Generic (PLEG): container finished" podID="47699c41-4940-4aec-9644-93dfea90094b" containerID="f8ee5e7a6b2b916ff363d2593c6593e161dc32b17d9b1ea1984733a15f19ec89" exitCode=137
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.330453 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" event={"ID":"47699c41-4940-4aec-9644-93dfea90094b","Type":"ContainerDied","Data":"f8ee5e7a6b2b916ff363d2593c6593e161dc32b17d9b1ea1984733a15f19ec89"}
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.554369 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq"
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.651089 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-combined-ca-bundle\") pod \"47699c41-4940-4aec-9644-93dfea90094b\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") "
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.651149 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data-custom\") pod \"47699c41-4940-4aec-9644-93dfea90094b\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") "
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.651186 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47699c41-4940-4aec-9644-93dfea90094b-logs\") pod \"47699c41-4940-4aec-9644-93dfea90094b\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") "
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.651226 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-internal-tls-certs\") pod \"47699c41-4940-4aec-9644-93dfea90094b\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") "
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.651270 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xm2r\" (UniqueName: \"kubernetes.io/projected/47699c41-4940-4aec-9644-93dfea90094b-kube-api-access-5xm2r\") pod \"47699c41-4940-4aec-9644-93dfea90094b\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") "
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.651319 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data\") pod \"47699c41-4940-4aec-9644-93dfea90094b\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") "
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.651370 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-public-tls-certs\") pod \"47699c41-4940-4aec-9644-93dfea90094b\" (UID: \"47699c41-4940-4aec-9644-93dfea90094b\") "
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.653102 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47699c41-4940-4aec-9644-93dfea90094b-logs" (OuterVolumeSpecName: "logs") pod "47699c41-4940-4aec-9644-93dfea90094b" (UID: "47699c41-4940-4aec-9644-93dfea90094b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.658541 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "47699c41-4940-4aec-9644-93dfea90094b" (UID: "47699c41-4940-4aec-9644-93dfea90094b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.671919 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47699c41-4940-4aec-9644-93dfea90094b-kube-api-access-5xm2r" (OuterVolumeSpecName: "kube-api-access-5xm2r") pod "47699c41-4940-4aec-9644-93dfea90094b" (UID: "47699c41-4940-4aec-9644-93dfea90094b"). InnerVolumeSpecName "kube-api-access-5xm2r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.687940 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47699c41-4940-4aec-9644-93dfea90094b" (UID: "47699c41-4940-4aec-9644-93dfea90094b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.691696 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data" (OuterVolumeSpecName: "config-data") pod "47699c41-4940-4aec-9644-93dfea90094b" (UID: "47699c41-4940-4aec-9644-93dfea90094b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.709391 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "47699c41-4940-4aec-9644-93dfea90094b" (UID: "47699c41-4940-4aec-9644-93dfea90094b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.714123 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "47699c41-4940-4aec-9644-93dfea90094b" (UID: "47699c41-4940-4aec-9644-93dfea90094b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.752599 4819 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.752632 4819 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.752642 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.752653 4819 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47699c41-4940-4aec-9644-93dfea90094b-logs\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.752663 4819 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.752672 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xm2r\" (UniqueName: \"kubernetes.io/projected/47699c41-4940-4aec-9644-93dfea90094b-kube-api-access-5xm2r\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:37 crc kubenswrapper[4819]: I0228 03:54:37.752683 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47699c41-4940-4aec-9644-93dfea90094b-config-data\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:38 crc kubenswrapper[4819]: I0228 03:54:38.345679 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq" event={"ID":"47699c41-4940-4aec-9644-93dfea90094b","Type":"ContainerDied","Data":"cc38fcb915b2a84cfaef38583d88a0f899b77203c25ba4d3b76da3ffa0d1ccd0"}
Feb 28 03:54:38 crc kubenswrapper[4819]: I0228 03:54:38.345774 4819 scope.go:117] "RemoveContainer" containerID="f8ee5e7a6b2b916ff363d2593c6593e161dc32b17d9b1ea1984733a15f19ec89"
Feb 28 03:54:38 crc kubenswrapper[4819]: I0228 03:54:38.345810 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq"
Feb 28 03:54:38 crc kubenswrapper[4819]: I0228 03:54:38.377147 4819 scope.go:117] "RemoveContainer" containerID="6adc800411b8ab1df635dea464b97e6db4b232e8cb5638bb5853e5275d1a6640"
Feb 28 03:54:38 crc kubenswrapper[4819]: I0228 03:54:38.410963 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq"]
Feb 28 03:54:38 crc kubenswrapper[4819]: I0228 03:54:38.422552 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-7bb8b6db58-6qnzq"]
Feb 28 03:54:40 crc kubenswrapper[4819]: I0228 03:54:40.382631 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47699c41-4940-4aec-9644-93dfea90094b" path="/var/lib/kubelet/pods/47699c41-4940-4aec-9644-93dfea90094b/volumes"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.222002 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-lm96w"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.231447 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-p4869"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.240234 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-p4869"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.253148 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-lm96w"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.258398 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.258642 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" podUID="c66de22e-9900-45fe-b074-455829b4084a" containerName="keystone-api" containerID="cri-o://f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259" gracePeriod=30
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264429 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"]
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264781 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398688ee-ba77-4410-8294-42eaffb91650" containerName="barbican-keystone-listener"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264793 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="398688ee-ba77-4410-8294-42eaffb91650" containerName="barbican-keystone-listener"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264801 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264807 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264820 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264827 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264846 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264852 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264867 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c61cde-2f52-4c63-a78f-683011ce9a51" containerName="barbican-worker"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264873 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c61cde-2f52-4c63-a78f-683011ce9a51" containerName="barbican-worker"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264880 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398688ee-ba77-4410-8294-42eaffb91650" containerName="barbican-keystone-listener-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264886 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="398688ee-ba77-4410-8294-42eaffb91650" containerName="barbican-keystone-listener-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264895 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" containerName="barbican-worker-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264900 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" containerName="barbican-worker-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264913 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264918 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264925 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9184103b-f4bc-413e-adda-26d6e0da77a3" containerName="barbican-keystone-listener-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264932 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="9184103b-f4bc-413e-adda-26d6e0da77a3" containerName="barbican-keystone-listener-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264941 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ad70b3-4188-42eb-9314-c606f72b4534" containerName="mariadb-account-delete"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264946 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ad70b3-4188-42eb-9314-c606f72b4534" containerName="mariadb-account-delete"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264956 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c61cde-2f52-4c63-a78f-683011ce9a51" containerName="barbican-worker-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264964 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c61cde-2f52-4c63-a78f-683011ce9a51" containerName="barbican-worker-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264974 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9184103b-f4bc-413e-adda-26d6e0da77a3" containerName="barbican-keystone-listener"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264981 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="9184103b-f4bc-413e-adda-26d6e0da77a3" containerName="barbican-keystone-listener"
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.264990 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" containerName="barbican-worker"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.264996 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" containerName="barbican-worker"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265099 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c61cde-2f52-4c63-a78f-683011ce9a51" containerName="barbican-worker"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265108 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265116 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="47699c41-4940-4aec-9644-93dfea90094b" containerName="barbican-api"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265124 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c61cde-2f52-4c63-a78f-683011ce9a51" containerName="barbican-worker-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265131 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="9184103b-f4bc-413e-adda-26d6e0da77a3" containerName="barbican-keystone-listener"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265141 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265148 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" containerName="barbican-worker-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265156 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="398688ee-ba77-4410-8294-42eaffb91650" containerName="barbican-keystone-listener-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265167 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a26f1d-d188-41ce-8a4f-bc52a2d8492f" containerName="barbican-api"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265175 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49c04aa-14d4-4d76-ae1f-73c8aa193b2c" containerName="barbican-worker"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265182 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="398688ee-ba77-4410-8294-42eaffb91650" containerName="barbican-keystone-listener"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265190 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="9184103b-f4bc-413e-adda-26d6e0da77a3" containerName="barbican-keystone-listener-log"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265199 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ad70b3-4188-42eb-9314-c606f72b4534" containerName="mariadb-account-delete"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.265661 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.279221 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.297439 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkj6w\" (UniqueName: \"kubernetes.io/projected/d285f14a-7c11-4d3c-8abc-188a4e2573ce-kube-api-access-rkj6w\") pod \"keystoneee83-account-delete-6nzk7\" (UID: \"d285f14a-7c11-4d3c-8abc-188a4e2573ce\") " pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.297526 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts\") pod \"keystoneee83-account-delete-6nzk7\" (UID: \"d285f14a-7c11-4d3c-8abc-188a4e2573ce\") " pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.399022 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts\") pod \"keystoneee83-account-delete-6nzk7\" (UID: \"d285f14a-7c11-4d3c-8abc-188a4e2573ce\") " pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.399135 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkj6w\" (UniqueName: \"kubernetes.io/projected/d285f14a-7c11-4d3c-8abc-188a4e2573ce-kube-api-access-rkj6w\") pod \"keystoneee83-account-delete-6nzk7\" (UID: \"d285f14a-7c11-4d3c-8abc-188a4e2573ce\") " pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.399883 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts\") pod \"keystoneee83-account-delete-6nzk7\" (UID: \"d285f14a-7c11-4d3c-8abc-188a4e2573ce\") " pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.421210 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkj6w\" (UniqueName: \"kubernetes.io/projected/d285f14a-7c11-4d3c-8abc-188a4e2573ce-kube-api-access-rkj6w\") pod \"keystoneee83-account-delete-6nzk7\" (UID: \"d285f14a-7c11-4d3c-8abc-188a4e2573ce\") " pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.587519 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.875916 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-hnkqn"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.909816 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-hnkqn"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.936371 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/root-account-create-update-m65ml"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.939618 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-m65ml"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.941969 4819 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"openstack-mariadb-root-db-secret"
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.952416 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.961440 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-m65ml"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.973684 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.980106 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"]
Feb 28 03:54:45 crc kubenswrapper[4819]: I0228 03:54:45.989499 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-m65ml"]
Feb 28 03:54:45 crc kubenswrapper[4819]: E0228 03:54:45.989715 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-pv62g operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="barbican-kuttl-tests/root-account-create-update-m65ml" podUID="32bcbe76-ddf7-486d-b750-ae97beee3662"
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.010482 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv62g\" (UniqueName: \"kubernetes.io/projected/32bcbe76-ddf7-486d-b750-ae97beee3662-kube-api-access-pv62g\") pod \"root-account-create-update-m65ml\" (UID: \"32bcbe76-ddf7-486d-b750-ae97beee3662\") " pod="barbican-kuttl-tests/root-account-create-update-m65ml"
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.010622 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32bcbe76-ddf7-486d-b750-ae97beee3662-operator-scripts\") pod \"root-account-create-update-m65ml\" (UID: \"32bcbe76-ddf7-486d-b750-ae97beee3662\") " pod="barbican-kuttl-tests/root-account-create-update-m65ml"
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.083415 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"]
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.112225 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32bcbe76-ddf7-486d-b750-ae97beee3662-operator-scripts\") pod \"root-account-create-update-m65ml\" (UID: \"32bcbe76-ddf7-486d-b750-ae97beee3662\") " pod="barbican-kuttl-tests/root-account-create-update-m65ml"
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.112335 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv62g\" (UniqueName: \"kubernetes.io/projected/32bcbe76-ddf7-486d-b750-ae97beee3662-kube-api-access-pv62g\") pod \"root-account-create-update-m65ml\" (UID: \"32bcbe76-ddf7-486d-b750-ae97beee3662\") " pod="barbican-kuttl-tests/root-account-create-update-m65ml"
Feb 28 03:54:46 crc kubenswrapper[4819]: E0228 03:54:46.112350 4819 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Feb 28 03:54:46 crc kubenswrapper[4819]: E0228 03:54:46.112429 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/32bcbe76-ddf7-486d-b750-ae97beee3662-operator-scripts podName:32bcbe76-ddf7-486d-b750-ae97beee3662 nodeName:}" failed. No retries permitted until 2026-02-28 03:54:46.612409704 +0000 UTC m=+1225.077978562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/32bcbe76-ddf7-486d-b750-ae97beee3662-operator-scripts") pod "root-account-create-update-m65ml" (UID: "32bcbe76-ddf7-486d-b750-ae97beee3662") : configmap "openstack-scripts" not found
Feb 28 03:54:46 crc kubenswrapper[4819]: E0228 03:54:46.116513 4819 projected.go:194] Error preparing data for projected volume kube-api-access-pv62g for pod barbican-kuttl-tests/root-account-create-update-m65ml: failed to fetch token: serviceaccounts "galera-openstack" not found
Feb 28 03:54:46 crc kubenswrapper[4819]: E0228 03:54:46.116562 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32bcbe76-ddf7-486d-b750-ae97beee3662-kube-api-access-pv62g podName:32bcbe76-ddf7-486d-b750-ae97beee3662 nodeName:}" failed. No retries permitted until 2026-02-28 03:54:46.616551046 +0000 UTC m=+1225.082119904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-pv62g" (UniqueName: "kubernetes.io/projected/32bcbe76-ddf7-486d-b750-ae97beee3662-kube-api-access-pv62g") pod "root-account-create-update-m65ml" (UID: "32bcbe76-ddf7-486d-b750-ae97beee3662") : failed to fetch token: serviceaccounts "galera-openstack" not found
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.143942 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-2" podUID="9b16747d-1b6b-44ee-896e-0ead9587deeb" containerName="galera" containerID="cri-o://eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f" gracePeriod=30
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.377199 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5424ceda-ffd2-4956-92f4-98e527ee26d3" path="/var/lib/kubelet/pods/5424ceda-ffd2-4956-92f4-98e527ee26d3/volumes"
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.377721 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6feb748-e921-4c51-992d-0a07a5afd987" path="/var/lib/kubelet/pods/a6feb748-e921-4c51-992d-0a07a5afd987/volumes"
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.378180 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2700e78-452b-498f-8620-b94999aac328" path="/var/lib/kubelet/pods/e2700e78-452b-498f-8620-b94999aac328/volumes"
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.415554 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-m65ml"
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.415559 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" event={"ID":"d285f14a-7c11-4d3c-8abc-188a4e2573ce","Type":"ContainerStarted","Data":"e08c94e93e112780e74dd029c45d4278a83fd117a8da48979682a098e2cbcca7"}
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.415623 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" event={"ID":"d285f14a-7c11-4d3c-8abc-188a4e2573ce","Type":"ContainerStarted","Data":"1b16f7d80ed3d27d6b22593ab3eacafc274bc690785ca8f587d9fd3bcbe2883e"}
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.416184 4819 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" secret="" err="secret \"galera-openstack-dockercfg-gb29b\" not found"
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.423332 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-m65ml"
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.435149 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" podStartSLOduration=1.435128397 podStartE2EDuration="1.435128397s" podCreationTimestamp="2026-02-28 03:54:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 03:54:46.432972874 +0000 UTC m=+1224.898541762" watchObservedRunningTime="2026-02-28 03:54:46.435128397 +0000 UTC m=+1224.900697295"
Feb 28 03:54:46 crc kubenswrapper[4819]: E0228 03:54:46.519058 4819 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Feb 28 03:54:46 crc kubenswrapper[4819]: E0228 03:54:46.519161 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts podName:d285f14a-7c11-4d3c-8abc-188a4e2573ce nodeName:}" failed. No retries permitted until 2026-02-28 03:54:47.019136804 +0000 UTC m=+1225.484705682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts") pod "keystoneee83-account-delete-6nzk7" (UID: "d285f14a-7c11-4d3c-8abc-188a4e2573ce") : configmap "openstack-scripts" not found
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.521784 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/memcached-0"]
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.521959 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/memcached-0" podUID="c684da03-6893-45f7-833c-2e71ad6c7e47" containerName="memcached" containerID="cri-o://9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0" gracePeriod=30
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.620323 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv62g\" (UniqueName: \"kubernetes.io/projected/32bcbe76-ddf7-486d-b750-ae97beee3662-kube-api-access-pv62g\") pod \"root-account-create-update-m65ml\" (UID: \"32bcbe76-ddf7-486d-b750-ae97beee3662\") " pod="barbican-kuttl-tests/root-account-create-update-m65ml"
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.620520 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32bcbe76-ddf7-486d-b750-ae97beee3662-operator-scripts\") pod \"root-account-create-update-m65ml\" (UID: \"32bcbe76-ddf7-486d-b750-ae97beee3662\") " pod="barbican-kuttl-tests/root-account-create-update-m65ml"
Feb 28 03:54:46 crc kubenswrapper[4819]: E0228 03:54:46.620644 4819 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Feb 28 03:54:46 crc kubenswrapper[4819]: E0228 03:54:46.620731 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/32bcbe76-ddf7-486d-b750-ae97beee3662-operator-scripts podName:32bcbe76-ddf7-486d-b750-ae97beee3662 nodeName:}" failed. No retries permitted until 2026-02-28 03:54:47.620707344 +0000 UTC m=+1226.086276232 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/32bcbe76-ddf7-486d-b750-ae97beee3662-operator-scripts") pod "root-account-create-update-m65ml" (UID: "32bcbe76-ddf7-486d-b750-ae97beee3662") : configmap "openstack-scripts" not found
Feb 28 03:54:46 crc kubenswrapper[4819]: E0228 03:54:46.624121 4819 projected.go:194] Error preparing data for projected volume kube-api-access-pv62g for pod barbican-kuttl-tests/root-account-create-update-m65ml: failed to fetch token: serviceaccounts "galera-openstack" not found
Feb 28 03:54:46 crc kubenswrapper[4819]: E0228 03:54:46.624219 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32bcbe76-ddf7-486d-b750-ae97beee3662-kube-api-access-pv62g podName:32bcbe76-ddf7-486d-b750-ae97beee3662 nodeName:}" failed. No retries permitted until 2026-02-28 03:54:47.62419287 +0000 UTC m=+1226.089761768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pv62g" (UniqueName: "kubernetes.io/projected/32bcbe76-ddf7-486d-b750-ae97beee3662-kube-api-access-pv62g") pod "root-account-create-update-m65ml" (UID: "32bcbe76-ddf7-486d-b750-ae97beee3662") : failed to fetch token: serviceaccounts "galera-openstack" not found
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.883448 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"]
Feb 28 03:54:46 crc kubenswrapper[4819]: I0228 03:54:46.983908 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"]
Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.018693 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2"
Feb 28 03:54:47 crc kubenswrapper[4819]: E0228 03:54:47.027188 4819 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Feb 28 03:54:47 crc kubenswrapper[4819]: E0228 03:54:47.027240 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts podName:d285f14a-7c11-4d3c-8abc-188a4e2573ce nodeName:}" failed. No retries permitted until 2026-02-28 03:54:48.027226519 +0000 UTC m=+1226.492795377 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts") pod "keystoneee83-account-delete-6nzk7" (UID: "d285f14a-7c11-4d3c-8abc-188a4e2573ce") : configmap "openstack-scripts" not found
Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.128494 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-operator-scripts\") pod \"9b16747d-1b6b-44ee-896e-0ead9587deeb\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") "
Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.128689 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-kolla-config\") pod \"9b16747d-1b6b-44ee-896e-0ead9587deeb\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") "
Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.128740 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"9b16747d-1b6b-44ee-896e-0ead9587deeb\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") "
Feb 28 03:54:47 crc kubenswrapper[4819]: I0228
03:54:47.128781 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-default\") pod \"9b16747d-1b6b-44ee-896e-0ead9587deeb\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.128809 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2clhx\" (UniqueName: \"kubernetes.io/projected/9b16747d-1b6b-44ee-896e-0ead9587deeb-kube-api-access-2clhx\") pod \"9b16747d-1b6b-44ee-896e-0ead9587deeb\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.128854 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-generated\") pod \"9b16747d-1b6b-44ee-896e-0ead9587deeb\" (UID: \"9b16747d-1b6b-44ee-896e-0ead9587deeb\") " Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.129303 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b16747d-1b6b-44ee-896e-0ead9587deeb" (UID: "9b16747d-1b6b-44ee-896e-0ead9587deeb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.129501 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9b16747d-1b6b-44ee-896e-0ead9587deeb" (UID: "9b16747d-1b6b-44ee-896e-0ead9587deeb"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.130216 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9b16747d-1b6b-44ee-896e-0ead9587deeb" (UID: "9b16747d-1b6b-44ee-896e-0ead9587deeb"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.134768 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9b16747d-1b6b-44ee-896e-0ead9587deeb" (UID: "9b16747d-1b6b-44ee-896e-0ead9587deeb"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.138521 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b16747d-1b6b-44ee-896e-0ead9587deeb-kube-api-access-2clhx" (OuterVolumeSpecName: "kube-api-access-2clhx") pod "9b16747d-1b6b-44ee-896e-0ead9587deeb" (UID: "9b16747d-1b6b-44ee-896e-0ead9587deeb"). InnerVolumeSpecName "kube-api-access-2clhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.141876 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "9b16747d-1b6b-44ee-896e-0ead9587deeb" (UID: "9b16747d-1b6b-44ee-896e-0ead9587deeb"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.230955 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.231020 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2clhx\" (UniqueName: \"kubernetes.io/projected/9b16747d-1b6b-44ee-896e-0ead9587deeb-kube-api-access-2clhx\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.231037 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9b16747d-1b6b-44ee-896e-0ead9587deeb-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.231050 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.231064 4819 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b16747d-1b6b-44ee-896e-0ead9587deeb-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.231116 4819 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.249396 4819 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.333041 4819 
reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.426941 4819 generic.go:334] "Generic (PLEG): container finished" podID="d285f14a-7c11-4d3c-8abc-188a4e2573ce" containerID="e08c94e93e112780e74dd029c45d4278a83fd117a8da48979682a098e2cbcca7" exitCode=1 Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.427007 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" event={"ID":"d285f14a-7c11-4d3c-8abc-188a4e2573ce","Type":"ContainerDied","Data":"e08c94e93e112780e74dd029c45d4278a83fd117a8da48979682a098e2cbcca7"} Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.427517 4819 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" secret="" err="secret \"galera-openstack-dockercfg-gb29b\" not found" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.427568 4819 scope.go:117] "RemoveContainer" containerID="e08c94e93e112780e74dd029c45d4278a83fd117a8da48979682a098e2cbcca7" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.431312 4819 generic.go:334] "Generic (PLEG): container finished" podID="9b16747d-1b6b-44ee-896e-0ead9587deeb" containerID="eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f" exitCode=0 Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.431353 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-m65ml" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.431897 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9b16747d-1b6b-44ee-896e-0ead9587deeb","Type":"ContainerDied","Data":"eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f"} Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.431954 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"9b16747d-1b6b-44ee-896e-0ead9587deeb","Type":"ContainerDied","Data":"c68709d983c6c4c4a3ac3b30bccfa9121a26c11a1d38b49fb69311681a23f363"} Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.431980 4819 scope.go:117] "RemoveContainer" containerID="eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.432121 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.499939 4819 scope.go:117] "RemoveContainer" containerID="5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.525233 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/rabbitmq-server-0" podUID="c8575a62-7205-495b-80ed-2c715e87cc72" containerName="rabbitmq" containerID="cri-o://d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3" gracePeriod=604800 Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.530474 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-m65ml"] Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.541994 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-m65ml"] Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.557029 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.561236 4819 scope.go:117] "RemoveContainer" containerID="eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f" Feb 28 03:54:47 crc kubenswrapper[4819]: E0228 03:54:47.561766 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f\": container with ID starting with eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f not found: ID does not exist" containerID="eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.561825 4819 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f"} err="failed to get container status \"eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f\": rpc error: code = NotFound desc = could not find container \"eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f\": container with ID starting with eae6bca6dff0a0d4fc4f67ff61946a049241847b6db1e3d7081732571d7c606f not found: ID does not exist" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.561860 4819 scope.go:117] "RemoveContainer" containerID="5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f" Feb 28 03:54:47 crc kubenswrapper[4819]: E0228 03:54:47.562206 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f\": container with ID starting with 5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f not found: ID does not exist" containerID="5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.562264 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f"} err="failed to get container status \"5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f\": rpc error: code = NotFound desc = could not find container \"5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f\": container with ID starting with 5751dc9aca51d3287c39920fb23fa4bdb279b8f3628e189f705a0de710b2189f not found: ID does not exist" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.571128 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.642544 4819 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pv62g\" (UniqueName: \"kubernetes.io/projected/32bcbe76-ddf7-486d-b750-ae97beee3662-kube-api-access-pv62g\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.642599 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32bcbe76-ddf7-486d-b750-ae97beee3662-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.960218 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd"] Feb 28 03:54:47 crc kubenswrapper[4819]: I0228 03:54:47.960481 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" podUID="b49fd157-5376-4da3-8d0d-11a9218ce42b" containerName="manager" containerID="cri-o://8d0499475c8b14ac0ab72990fa6f2c6c91c665fa22b5935f98f254f819afc866" gracePeriod=10 Feb 28 03:54:48 crc kubenswrapper[4819]: E0228 03:54:48.048440 4819 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Feb 28 03:54:48 crc kubenswrapper[4819]: E0228 03:54:48.048513 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts podName:d285f14a-7c11-4d3c-8abc-188a4e2573ce nodeName:}" failed. No retries permitted until 2026-02-28 03:54:50.048496852 +0000 UTC m=+1228.514065710 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts") pod "keystoneee83-account-delete-6nzk7" (UID: "d285f14a-7c11-4d3c-8abc-188a4e2573ce") : configmap "openstack-scripts" not found Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.178549 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-jb4rj"] Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.178741 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-jb4rj" podUID="82ee77d5-3f8c-42b9-8025-6c6c73fa17fc" containerName="registry-server" containerID="cri-o://d1eb67778e8cb89f7837f042bd88806819c80e72bee56a9ac1e30c8f8c7b08cc" gracePeriod=30 Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.198869 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-1" podUID="a18d0ed2-f5db-4f32-b635-7956eeee1f01" containerName="galera" containerID="cri-o://5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb" gracePeriod=28 Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.217201 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr"] Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.221505 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/2c7cc19ff9f5741aa8fac9cd59d7ba62b2def16052061ea7bbd4af23f6gs6rr"] Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.383036 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="284bc0b4-fcb8-4b80-94f1-0b232de6684f" path="/var/lib/kubelet/pods/284bc0b4-fcb8-4b80-94f1-0b232de6684f/volumes" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.383881 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="32bcbe76-ddf7-486d-b750-ae97beee3662" path="/var/lib/kubelet/pods/32bcbe76-ddf7-486d-b750-ae97beee3662/volumes" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.384221 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b16747d-1b6b-44ee-896e-0ead9587deeb" path="/var/lib/kubelet/pods/9b16747d-1b6b-44ee-896e-0ead9587deeb/volumes" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.403955 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.442409 4819 generic.go:334] "Generic (PLEG): container finished" podID="82ee77d5-3f8c-42b9-8025-6c6c73fa17fc" containerID="d1eb67778e8cb89f7837f042bd88806819c80e72bee56a9ac1e30c8f8c7b08cc" exitCode=0 Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.442459 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-jb4rj" event={"ID":"82ee77d5-3f8c-42b9-8025-6c6c73fa17fc","Type":"ContainerDied","Data":"d1eb67778e8cb89f7837f042bd88806819c80e72bee56a9ac1e30c8f8c7b08cc"} Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.444377 4819 generic.go:334] "Generic (PLEG): container finished" podID="c684da03-6893-45f7-833c-2e71ad6c7e47" containerID="9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0" exitCode=0 Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.444415 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"c684da03-6893-45f7-833c-2e71ad6c7e47","Type":"ContainerDied","Data":"9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0"} Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.444431 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" 
event={"ID":"c684da03-6893-45f7-833c-2e71ad6c7e47","Type":"ContainerDied","Data":"2f46cf4ac61759660ed083d4f4e98911b369237af71213522ae18fc1c5e3c2c8"} Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.444446 4819 scope.go:117] "RemoveContainer" containerID="9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.444516 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.448520 4819 generic.go:334] "Generic (PLEG): container finished" podID="b49fd157-5376-4da3-8d0d-11a9218ce42b" containerID="8d0499475c8b14ac0ab72990fa6f2c6c91c665fa22b5935f98f254f819afc866" exitCode=0 Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.448565 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" event={"ID":"b49fd157-5376-4da3-8d0d-11a9218ce42b","Type":"ContainerDied","Data":"8d0499475c8b14ac0ab72990fa6f2c6c91c665fa22b5935f98f254f819afc866"} Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.451933 4819 generic.go:334] "Generic (PLEG): container finished" podID="d285f14a-7c11-4d3c-8abc-188a4e2573ce" containerID="e5baa0b7f903de16b237641506c9f6d76d37f67bb89fd97b4e62c89ede7cf5da" exitCode=1 Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.452030 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" event={"ID":"d285f14a-7c11-4d3c-8abc-188a4e2573ce","Type":"ContainerDied","Data":"e5baa0b7f903de16b237641506c9f6d76d37f67bb89fd97b4e62c89ede7cf5da"} Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.452437 4819 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" secret="" err="secret \"galera-openstack-dockercfg-gb29b\" not found" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.452477 4819 scope.go:117] "RemoveContainer" containerID="e5baa0b7f903de16b237641506c9f6d76d37f67bb89fd97b4e62c89ede7cf5da" Feb 28 03:54:48 crc kubenswrapper[4819]: E0228 03:54:48.452692 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystoneee83-account-delete-6nzk7_barbican-kuttl-tests(d285f14a-7c11-4d3c-8abc-188a4e2573ce)\"" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" podUID="d285f14a-7c11-4d3c-8abc-188a4e2573ce" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.453145 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmqc8\" (UniqueName: \"kubernetes.io/projected/c684da03-6893-45f7-833c-2e71ad6c7e47-kube-api-access-rmqc8\") pod \"c684da03-6893-45f7-833c-2e71ad6c7e47\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.453359 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-config-data\") pod \"c684da03-6893-45f7-833c-2e71ad6c7e47\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.453426 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-kolla-config\") pod \"c684da03-6893-45f7-833c-2e71ad6c7e47\" (UID: \"c684da03-6893-45f7-833c-2e71ad6c7e47\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.454176 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-config-data" (OuterVolumeSpecName: "config-data") pod "c684da03-6893-45f7-833c-2e71ad6c7e47" (UID: "c684da03-6893-45f7-833c-2e71ad6c7e47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.454871 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "c684da03-6893-45f7-833c-2e71ad6c7e47" (UID: "c684da03-6893-45f7-833c-2e71ad6c7e47"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.458518 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c684da03-6893-45f7-833c-2e71ad6c7e47-kube-api-access-rmqc8" (OuterVolumeSpecName: "kube-api-access-rmqc8") pod "c684da03-6893-45f7-833c-2e71ad6c7e47" (UID: "c684da03-6893-45f7-833c-2e71ad6c7e47"). InnerVolumeSpecName "kube-api-access-rmqc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.471444 4819 scope.go:117] "RemoveContainer" containerID="9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0" Feb 28 03:54:48 crc kubenswrapper[4819]: E0228 03:54:48.471915 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0\": container with ID starting with 9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0 not found: ID does not exist" containerID="9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.471958 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0"} err="failed to get container status \"9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0\": rpc error: code = NotFound desc = could not find container \"9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0\": container with ID starting with 9b9b6c3e6a833922fd381ff9d8ce5746b6aa282062581da8ff5ff85a15368ae0 not found: ID does not exist" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.472216 4819 scope.go:117] "RemoveContainer" containerID="e08c94e93e112780e74dd029c45d4278a83fd117a8da48979682a098e2cbcca7" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.477037 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.555934 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-apiservice-cert\") pod \"b49fd157-5376-4da3-8d0d-11a9218ce42b\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.555991 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ghfn\" (UniqueName: \"kubernetes.io/projected/b49fd157-5376-4da3-8d0d-11a9218ce42b-kube-api-access-4ghfn\") pod \"b49fd157-5376-4da3-8d0d-11a9218ce42b\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.556021 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-webhook-cert\") pod \"b49fd157-5376-4da3-8d0d-11a9218ce42b\" (UID: \"b49fd157-5376-4da3-8d0d-11a9218ce42b\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.556199 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmqc8\" (UniqueName: \"kubernetes.io/projected/c684da03-6893-45f7-833c-2e71ad6c7e47-kube-api-access-rmqc8\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.556214 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.556225 4819 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c684da03-6893-45f7-833c-2e71ad6c7e47-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 
crc kubenswrapper[4819]: I0228 03:54:48.561380 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "b49fd157-5376-4da3-8d0d-11a9218ce42b" (UID: "b49fd157-5376-4da3-8d0d-11a9218ce42b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.561477 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49fd157-5376-4da3-8d0d-11a9218ce42b-kube-api-access-4ghfn" (OuterVolumeSpecName: "kube-api-access-4ghfn") pod "b49fd157-5376-4da3-8d0d-11a9218ce42b" (UID: "b49fd157-5376-4da3-8d0d-11a9218ce42b"). InnerVolumeSpecName "kube-api-access-4ghfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.562029 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "b49fd157-5376-4da3-8d0d-11a9218ce42b" (UID: "b49fd157-5376-4da3-8d0d-11a9218ce42b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.586301 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-jb4rj" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.657378 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmqb4\" (UniqueName: \"kubernetes.io/projected/82ee77d5-3f8c-42b9-8025-6c6c73fa17fc-kube-api-access-gmqb4\") pod \"82ee77d5-3f8c-42b9-8025-6c6c73fa17fc\" (UID: \"82ee77d5-3f8c-42b9-8025-6c6c73fa17fc\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.657651 4819 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.657668 4819 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b49fd157-5376-4da3-8d0d-11a9218ce42b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.657710 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ghfn\" (UniqueName: \"kubernetes.io/projected/b49fd157-5376-4da3-8d0d-11a9218ce42b-kube-api-access-4ghfn\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.663465 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82ee77d5-3f8c-42b9-8025-6c6c73fa17fc-kube-api-access-gmqb4" (OuterVolumeSpecName: "kube-api-access-gmqb4") pod "82ee77d5-3f8c-42b9-8025-6c6c73fa17fc" (UID: "82ee77d5-3f8c-42b9-8025-6c6c73fa17fc"). InnerVolumeSpecName "kube-api-access-gmqb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.759275 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmqb4\" (UniqueName: \"kubernetes.io/projected/82ee77d5-3f8c-42b9-8025-6c6c73fa17fc-kube-api-access-gmqb4\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.806827 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.818597 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.822296 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.859927 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-scripts\") pod \"c66de22e-9900-45fe-b074-455829b4084a\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.860000 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-config-data\") pod \"c66de22e-9900-45fe-b074-455829b4084a\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.860086 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-credential-keys\") pod \"c66de22e-9900-45fe-b074-455829b4084a\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.860111 4819 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pr465\" (UniqueName: \"kubernetes.io/projected/c66de22e-9900-45fe-b074-455829b4084a-kube-api-access-pr465\") pod \"c66de22e-9900-45fe-b074-455829b4084a\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.860205 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-fernet-keys\") pod \"c66de22e-9900-45fe-b074-455829b4084a\" (UID: \"c66de22e-9900-45fe-b074-455829b4084a\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.863946 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-scripts" (OuterVolumeSpecName: "scripts") pod "c66de22e-9900-45fe-b074-455829b4084a" (UID: "c66de22e-9900-45fe-b074-455829b4084a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.864280 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c66de22e-9900-45fe-b074-455829b4084a" (UID: "c66de22e-9900-45fe-b074-455829b4084a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.864512 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c66de22e-9900-45fe-b074-455829b4084a-kube-api-access-pr465" (OuterVolumeSpecName: "kube-api-access-pr465") pod "c66de22e-9900-45fe-b074-455829b4084a" (UID: "c66de22e-9900-45fe-b074-455829b4084a"). InnerVolumeSpecName "kube-api-access-pr465". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.864784 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c66de22e-9900-45fe-b074-455829b4084a" (UID: "c66de22e-9900-45fe-b074-455829b4084a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.894436 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-config-data" (OuterVolumeSpecName: "config-data") pod "c66de22e-9900-45fe-b074-455829b4084a" (UID: "c66de22e-9900-45fe-b074-455829b4084a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.915369 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.961696 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8575a62-7205-495b-80ed-2c715e87cc72-plugins-conf\") pod \"c8575a62-7205-495b-80ed-2c715e87cc72\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.961755 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-erlang-cookie\") pod \"c8575a62-7205-495b-80ed-2c715e87cc72\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.961815 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8575a62-7205-495b-80ed-2c715e87cc72-erlang-cookie-secret\") pod \"c8575a62-7205-495b-80ed-2c715e87cc72\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.961843 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-confd\") pod \"c8575a62-7205-495b-80ed-2c715e87cc72\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.961875 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-plugins\") pod \"c8575a62-7205-495b-80ed-2c715e87cc72\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.961913 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8575a62-7205-495b-80ed-2c715e87cc72-pod-info\") pod \"c8575a62-7205-495b-80ed-2c715e87cc72\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.962041 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\") pod \"c8575a62-7205-495b-80ed-2c715e87cc72\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.962085 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n4g8\" (UniqueName: \"kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-kube-api-access-2n4g8\") pod \"c8575a62-7205-495b-80ed-2c715e87cc72\" (UID: \"c8575a62-7205-495b-80ed-2c715e87cc72\") " Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.962211 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8575a62-7205-495b-80ed-2c715e87cc72-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c8575a62-7205-495b-80ed-2c715e87cc72" (UID: "c8575a62-7205-495b-80ed-2c715e87cc72"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.962663 4819 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.962692 4819 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-config-data\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.962726 4819 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.962741 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr465\" (UniqueName: \"kubernetes.io/projected/c66de22e-9900-45fe-b074-455829b4084a-kube-api-access-pr465\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.962754 4819 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8575a62-7205-495b-80ed-2c715e87cc72-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.962765 4819 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c66de22e-9900-45fe-b074-455829b4084a-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.963215 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c8575a62-7205-495b-80ed-2c715e87cc72" (UID: 
"c8575a62-7205-495b-80ed-2c715e87cc72"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.963623 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c8575a62-7205-495b-80ed-2c715e87cc72" (UID: "c8575a62-7205-495b-80ed-2c715e87cc72"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.965020 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-kube-api-access-2n4g8" (OuterVolumeSpecName: "kube-api-access-2n4g8") pod "c8575a62-7205-495b-80ed-2c715e87cc72" (UID: "c8575a62-7205-495b-80ed-2c715e87cc72"). InnerVolumeSpecName "kube-api-access-2n4g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.965866 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8575a62-7205-495b-80ed-2c715e87cc72-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c8575a62-7205-495b-80ed-2c715e87cc72" (UID: "c8575a62-7205-495b-80ed-2c715e87cc72"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.966310 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c8575a62-7205-495b-80ed-2c715e87cc72-pod-info" (OuterVolumeSpecName: "pod-info") pod "c8575a62-7205-495b-80ed-2c715e87cc72" (UID: "c8575a62-7205-495b-80ed-2c715e87cc72"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 28 03:54:48 crc kubenswrapper[4819]: I0228 03:54:48.970279 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d" (OuterVolumeSpecName: "persistence") pod "c8575a62-7205-495b-80ed-2c715e87cc72" (UID: "c8575a62-7205-495b-80ed-2c715e87cc72"). InnerVolumeSpecName "pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.018744 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c8575a62-7205-495b-80ed-2c715e87cc72" (UID: "c8575a62-7205-495b-80ed-2c715e87cc72"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.064533 4819 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.064859 4819 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8575a62-7205-495b-80ed-2c715e87cc72-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.064935 4819 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.064989 4819 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c8575a62-7205-495b-80ed-2c715e87cc72-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.065056 4819 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8575a62-7205-495b-80ed-2c715e87cc72-pod-info\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.065157 4819 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\") on node \"crc\" " Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.065227 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n4g8\" (UniqueName: \"kubernetes.io/projected/c8575a62-7205-495b-80ed-2c715e87cc72-kube-api-access-2n4g8\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.077365 4819 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.077636 4819 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d") on node "crc" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.167330 4819 reconciler_common.go:293] "Volume detached for volume \"pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2f8e89f-3215-4109-8e92-25ac93d2e55d\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.461934 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.461938 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd" event={"ID":"b49fd157-5376-4da3-8d0d-11a9218ce42b","Type":"ContainerDied","Data":"39946b790522f225db17ce25624cd5d153ba5290337061da44f8e2412253de77"} Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.462216 4819 scope.go:117] "RemoveContainer" containerID="8d0499475c8b14ac0ab72990fa6f2c6c91c665fa22b5935f98f254f819afc866" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.464728 4819 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" secret="" err="secret \"galera-openstack-dockercfg-gb29b\" not found" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.464811 4819 scope.go:117] "RemoveContainer" containerID="e5baa0b7f903de16b237641506c9f6d76d37f67bb89fd97b4e62c89ede7cf5da" Feb 28 03:54:49 crc kubenswrapper[4819]: E0228 03:54:49.465273 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystoneee83-account-delete-6nzk7_barbican-kuttl-tests(d285f14a-7c11-4d3c-8abc-188a4e2573ce)\"" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" podUID="d285f14a-7c11-4d3c-8abc-188a4e2573ce" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.470697 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-jb4rj" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.471288 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-jb4rj" event={"ID":"82ee77d5-3f8c-42b9-8025-6c6c73fa17fc","Type":"ContainerDied","Data":"539d9bb2be4f1108e7298473daad3a65c67c95507973367f84762c8efc1092d3"} Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.478718 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"c8575a62-7205-495b-80ed-2c715e87cc72","Type":"ContainerDied","Data":"d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3"} Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.478830 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.478718 4819 generic.go:334] "Generic (PLEG): container finished" podID="c8575a62-7205-495b-80ed-2c715e87cc72" containerID="d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3" exitCode=0 Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.478943 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"c8575a62-7205-495b-80ed-2c715e87cc72","Type":"ContainerDied","Data":"553758d51adb878b84a876cd4bbb2a8379b8c408688813adeb28f7c08a847dee"} Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.481124 4819 generic.go:334] "Generic (PLEG): container finished" podID="c66de22e-9900-45fe-b074-455829b4084a" containerID="f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259" exitCode=0 Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.481164 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" 
event={"ID":"c66de22e-9900-45fe-b074-455829b4084a","Type":"ContainerDied","Data":"f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259"} Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.481191 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" event={"ID":"c66de22e-9900-45fe-b074-455829b4084a","Type":"ContainerDied","Data":"2e00e8496511b46b87434c0377b4dc040a9cbb399c62cfa317b8a43d7f9aa06b"} Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.481493 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.493632 4819 scope.go:117] "RemoveContainer" containerID="d1eb67778e8cb89f7837f042bd88806819c80e72bee56a9ac1e30c8f8c7b08cc" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.516908 4819 scope.go:117] "RemoveContainer" containerID="d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.525104 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd"] Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.530393 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55cbccd744-wl4fd"] Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.536236 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-jb4rj"] Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.539700 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-jb4rj"] Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.545486 4819 scope.go:117] "RemoveContainer" containerID="550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 
03:54:49.547473 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v"] Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.552628 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v"] Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.566382 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.569082 4819 scope.go:117] "RemoveContainer" containerID="d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3" Feb 28 03:54:49 crc kubenswrapper[4819]: E0228 03:54:49.569710 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3\": container with ID starting with d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3 not found: ID does not exist" containerID="d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.569754 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3"} err="failed to get container status \"d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3\": rpc error: code = NotFound desc = could not find container \"d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3\": container with ID starting with d21b87233e516646cdb52e026677c15a58e1995e8550e4aa7626c027e74a16d3 not found: ID does not exist" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.569785 4819 scope.go:117] "RemoveContainer" containerID="550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa" Feb 28 03:54:49 crc kubenswrapper[4819]: E0228 03:54:49.570150 4819 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa\": container with ID starting with 550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa not found: ID does not exist" containerID="550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.570173 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa"} err="failed to get container status \"550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa\": rpc error: code = NotFound desc = could not find container \"550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa\": container with ID starting with 550bbd8efb34a3b9d8b116ea2a8762408b4251cd7f20d489a8bb20a4d1b430aa not found: ID does not exist" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.570190 4819 scope.go:117] "RemoveContainer" containerID="f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.570650 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.590849 4819 scope.go:117] "RemoveContainer" containerID="f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259" Feb 28 03:54:49 crc kubenswrapper[4819]: E0228 03:54:49.591652 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259\": container with ID starting with f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259 not found: ID does not exist" containerID="f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.591682 4819 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259"} err="failed to get container status \"f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259\": rpc error: code = NotFound desc = could not find container \"f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259\": container with ID starting with f77e7ccb2f8ec9f7b68e0250b631b7e35b68e4d4b87e396c7c0e385f55752259 not found: ID does not exist" Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.971510 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m"] Feb 28 03:54:49 crc kubenswrapper[4819]: I0228 03:54:49.972027 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" podUID="8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c" containerName="manager" containerID="cri-o://080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40" gracePeriod=10 Feb 28 03:54:50 crc kubenswrapper[4819]: E0228 03:54:50.082845 4819 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Feb 28 03:54:50 crc kubenswrapper[4819]: E0228 03:54:50.082941 4819 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts podName:d285f14a-7c11-4d3c-8abc-188a4e2573ce nodeName:}" failed. No retries permitted until 2026-02-28 03:54:54.082918699 +0000 UTC m=+1232.548487607 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts") pod "keystoneee83-account-delete-6nzk7" (UID: "d285f14a-7c11-4d3c-8abc-188a4e2573ce") : configmap "openstack-scripts" not found Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.146509 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.183846 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kolla-config\") pod \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.184000 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-operator-scripts\") pod \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.184083 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cglx8\" (UniqueName: \"kubernetes.io/projected/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kube-api-access-cglx8\") pod \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.184193 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-generated\") pod \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.184304 4819 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-default\") pod \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.184380 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\" (UID: \"a18d0ed2-f5db-4f32-b635-7956eeee1f01\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.186086 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a18d0ed2-f5db-4f32-b635-7956eeee1f01" (UID: "a18d0ed2-f5db-4f32-b635-7956eeee1f01"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.186857 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a18d0ed2-f5db-4f32-b635-7956eeee1f01" (UID: "a18d0ed2-f5db-4f32-b635-7956eeee1f01"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.187778 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "a18d0ed2-f5db-4f32-b635-7956eeee1f01" (UID: "a18d0ed2-f5db-4f32-b635-7956eeee1f01"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.187925 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "a18d0ed2-f5db-4f32-b635-7956eeee1f01" (UID: "a18d0ed2-f5db-4f32-b635-7956eeee1f01"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.198796 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kube-api-access-cglx8" (OuterVolumeSpecName: "kube-api-access-cglx8") pod "a18d0ed2-f5db-4f32-b635-7956eeee1f01" (UID: "a18d0ed2-f5db-4f32-b635-7956eeee1f01"). InnerVolumeSpecName "kube-api-access-cglx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.215725 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "a18d0ed2-f5db-4f32-b635-7956eeee1f01" (UID: "a18d0ed2-f5db-4f32-b635-7956eeee1f01"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.226330 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-92zwp"] Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.226606 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-92zwp" podUID="967804ee-64cc-4594-900d-be115f006e13" containerName="registry-server" containerID="cri-o://9e4ad6facbc1340ef90d01f1148163fc8e33c7f10755df06a4cad94a2218a18a" gracePeriod=30 Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.269628 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"] Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.279407 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/8845177aedda4f0f836d07ec11d403c737142048dd780f901ecfa77a589vkwn"] Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.287859 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-lgl97"] Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.289127 4819 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.289146 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.289162 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cglx8\" (UniqueName: \"kubernetes.io/projected/a18d0ed2-f5db-4f32-b635-7956eeee1f01-kube-api-access-cglx8\") on node \"crc\" 
DevicePath \"\"" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.289174 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.289185 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a18d0ed2-f5db-4f32-b635-7956eeee1f01-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.289211 4819 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.294272 4819 scope.go:117] "RemoveContainer" containerID="bf95251f6ec7d2a5705c95048ed01fcaa6d3401462b3877b8500199ef85a2f72" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.296352 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-lgl97"] Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.303710 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-0" podUID="81cae302-997a-482b-b76a-90b1172083b1" containerName="galera" containerID="cri-o://3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e" gracePeriod=26 Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.304618 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"] Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.309036 4819 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 
03:54:50.316407 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5"] Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.320112 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-ee83-account-create-update-5qtv5"] Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.360583 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.382610 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="778a63de-9182-440d-9a6c-b9526ccb40fe" path="/var/lib/kubelet/pods/778a63de-9182-440d-9a6c-b9526ccb40fe/volumes" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.383925 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c428fea-2d2c-4e5c-9244-8eedf6cae97f" path="/var/lib/kubelet/pods/7c428fea-2d2c-4e5c-9244-8eedf6cae97f/volumes" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.384419 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82ee77d5-3f8c-42b9-8025-6c6c73fa17fc" path="/var/lib/kubelet/pods/82ee77d5-3f8c-42b9-8025-6c6c73fa17fc/volumes" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.384838 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b49fd157-5376-4da3-8d0d-11a9218ce42b" path="/var/lib/kubelet/pods/b49fd157-5376-4da3-8d0d-11a9218ce42b/volumes" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.386176 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c66de22e-9900-45fe-b074-455829b4084a" path="/var/lib/kubelet/pods/c66de22e-9900-45fe-b074-455829b4084a/volumes" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.386769 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c684da03-6893-45f7-833c-2e71ad6c7e47" 
path="/var/lib/kubelet/pods/c684da03-6893-45f7-833c-2e71ad6c7e47/volumes" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.387224 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f29299-f69a-4e2c-938a-59404c33d64c" path="/var/lib/kubelet/pods/c6f29299-f69a-4e2c-938a-59404c33d64c/volumes" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.388305 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8575a62-7205-495b-80ed-2c715e87cc72" path="/var/lib/kubelet/pods/c8575a62-7205-495b-80ed-2c715e87cc72/volumes" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.390149 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-webhook-cert\") pod \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\" (UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.390872 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jhxv\" (UniqueName: \"kubernetes.io/projected/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-kube-api-access-7jhxv\") pod \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\" (UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.390908 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-apiservice-cert\") pod \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\" (UID: \"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.391231 4819 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.393947 4819 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-kube-api-access-7jhxv" (OuterVolumeSpecName: "kube-api-access-7jhxv") pod "8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c" (UID: "8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c"). InnerVolumeSpecName "kube-api-access-7jhxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.396892 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c" (UID: "8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.396984 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c" (UID: "8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.492992 4819 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.493020 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jhxv\" (UniqueName: \"kubernetes.io/projected/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-kube-api-access-7jhxv\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.493036 4819 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.497736 4819 generic.go:334] "Generic (PLEG): container finished" podID="a18d0ed2-f5db-4f32-b635-7956eeee1f01" containerID="5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb" exitCode=0 Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.497807 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"a18d0ed2-f5db-4f32-b635-7956eeee1f01","Type":"ContainerDied","Data":"5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb"} Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.497838 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"a18d0ed2-f5db-4f32-b635-7956eeee1f01","Type":"ContainerDied","Data":"d3abc08dd4cb84d0f83276eb36a360e01f33b59023d96e2be34ae56ac34cd48f"} Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.497861 4819 scope.go:117] "RemoveContainer" containerID="5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.497962 4819 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.502612 4819 generic.go:334] "Generic (PLEG): container finished" podID="967804ee-64cc-4594-900d-be115f006e13" containerID="9e4ad6facbc1340ef90d01f1148163fc8e33c7f10755df06a4cad94a2218a18a" exitCode=0 Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.502704 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-92zwp" event={"ID":"967804ee-64cc-4594-900d-be115f006e13","Type":"ContainerDied","Data":"9e4ad6facbc1340ef90d01f1148163fc8e33c7f10755df06a4cad94a2218a18a"} Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.523664 4819 generic.go:334] "Generic (PLEG): container finished" podID="8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c" containerID="080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40" exitCode=0 Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.524138 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.525414 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.525462 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" event={"ID":"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c","Type":"ContainerDied","Data":"080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40"} Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.525493 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m" event={"ID":"8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c","Type":"ContainerDied","Data":"fd177a251b1db2bf83d95162348b32906e52321b80ffea711d261942676bad28"} Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.544444 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.546495 4819 scope.go:117] "RemoveContainer" containerID="5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d" Feb 28 03:54:50 crc kubenswrapper[4819]: E0228 03:54:50.567106 4819 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/openstack-galera-1_barbican-kuttl-tests_mysql-bootstrap-5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d.log: no such file or directory" path="/var/log/containers/openstack-galera-1_barbican-kuttl-tests_mysql-bootstrap-5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d.log" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.577824 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m"] Feb 28 03:54:50 
crc kubenswrapper[4819]: I0228 03:54:50.584092 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7959cbcbf4-vs45m"] Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.624061 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-92zwp" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.634983 4819 scope.go:117] "RemoveContainer" containerID="5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb" Feb 28 03:54:50 crc kubenswrapper[4819]: E0228 03:54:50.635612 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb\": container with ID starting with 5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb not found: ID does not exist" containerID="5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.635658 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb"} err="failed to get container status \"5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb\": rpc error: code = NotFound desc = could not find container \"5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb\": container with ID starting with 5d856a2c4f42a733de4d8ed6f99394033b6067d937855876aeba3a1c2045d5bb not found: ID does not exist" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.635685 4819 scope.go:117] "RemoveContainer" containerID="5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d" Feb 28 03:54:50 crc kubenswrapper[4819]: E0228 03:54:50.636424 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d\": container with ID starting with 5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d not found: ID does not exist" containerID="5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.636471 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d"} err="failed to get container status \"5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d\": rpc error: code = NotFound desc = could not find container \"5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d\": container with ID starting with 5cff22d31ad00f3fd2daf2cbdae48cb03fb0af2f6dfb5567a7a6e6f49390662d not found: ID does not exist" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.636493 4819 scope.go:117] "RemoveContainer" containerID="080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.658170 4819 scope.go:117] "RemoveContainer" containerID="080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40" Feb 28 03:54:50 crc kubenswrapper[4819]: E0228 03:54:50.658593 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40\": container with ID starting with 080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40 not found: ID does not exist" containerID="080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.658620 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40"} err="failed to get container status 
\"080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40\": rpc error: code = NotFound desc = could not find container \"080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40\": container with ID starting with 080db5e5c921bc449ed1d8fc881b30e5edc96e471fba802e314820561994aa40 not found: ID does not exist" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.700172 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgxhm\" (UniqueName: \"kubernetes.io/projected/967804ee-64cc-4594-900d-be115f006e13-kube-api-access-kgxhm\") pod \"967804ee-64cc-4594-900d-be115f006e13\" (UID: \"967804ee-64cc-4594-900d-be115f006e13\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.703842 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/967804ee-64cc-4594-900d-be115f006e13-kube-api-access-kgxhm" (OuterVolumeSpecName: "kube-api-access-kgxhm") pod "967804ee-64cc-4594-900d-be115f006e13" (UID: "967804ee-64cc-4594-900d-be115f006e13"). InnerVolumeSpecName "kube-api-access-kgxhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.806443 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgxhm\" (UniqueName: \"kubernetes.io/projected/967804ee-64cc-4594-900d-be115f006e13-kube-api-access-kgxhm\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.809989 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.907003 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkj6w\" (UniqueName: \"kubernetes.io/projected/d285f14a-7c11-4d3c-8abc-188a4e2573ce-kube-api-access-rkj6w\") pod \"d285f14a-7c11-4d3c-8abc-188a4e2573ce\" (UID: \"d285f14a-7c11-4d3c-8abc-188a4e2573ce\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.907072 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts\") pod \"d285f14a-7c11-4d3c-8abc-188a4e2573ce\" (UID: \"d285f14a-7c11-4d3c-8abc-188a4e2573ce\") " Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.907788 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d285f14a-7c11-4d3c-8abc-188a4e2573ce" (UID: "d285f14a-7c11-4d3c-8abc-188a4e2573ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.916810 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d285f14a-7c11-4d3c-8abc-188a4e2573ce-kube-api-access-rkj6w" (OuterVolumeSpecName: "kube-api-access-rkj6w") pod "d285f14a-7c11-4d3c-8abc-188a4e2573ce" (UID: "d285f14a-7c11-4d3c-8abc-188a4e2573ce"). InnerVolumeSpecName "kube-api-access-rkj6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:50 crc kubenswrapper[4819]: I0228 03:54:50.998452 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.008417 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkj6w\" (UniqueName: \"kubernetes.io/projected/d285f14a-7c11-4d3c-8abc-188a4e2573ce-kube-api-access-rkj6w\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.008450 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d285f14a-7c11-4d3c-8abc-188a4e2573ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.118025 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-kolla-config\") pod \"81cae302-997a-482b-b76a-90b1172083b1\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.119317 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "81cae302-997a-482b-b76a-90b1172083b1" (UID: "81cae302-997a-482b-b76a-90b1172083b1"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.119402 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-config-data-default\") pod \"81cae302-997a-482b-b76a-90b1172083b1\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.119485 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdkww\" (UniqueName: \"kubernetes.io/projected/81cae302-997a-482b-b76a-90b1172083b1-kube-api-access-tdkww\") pod \"81cae302-997a-482b-b76a-90b1172083b1\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.119522 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-operator-scripts\") pod \"81cae302-997a-482b-b76a-90b1172083b1\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.119643 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/81cae302-997a-482b-b76a-90b1172083b1-config-data-generated\") pod \"81cae302-997a-482b-b76a-90b1172083b1\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.119735 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"81cae302-997a-482b-b76a-90b1172083b1\" (UID: \"81cae302-997a-482b-b76a-90b1172083b1\") " Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.121287 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/81cae302-997a-482b-b76a-90b1172083b1-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "81cae302-997a-482b-b76a-90b1172083b1" (UID: "81cae302-997a-482b-b76a-90b1172083b1"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.121459 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "81cae302-997a-482b-b76a-90b1172083b1" (UID: "81cae302-997a-482b-b76a-90b1172083b1"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.121510 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81cae302-997a-482b-b76a-90b1172083b1" (UID: "81cae302-997a-482b-b76a-90b1172083b1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.121779 4819 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.121809 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/81cae302-997a-482b-b76a-90b1172083b1-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.121833 4819 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.121852 4819 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/81cae302-997a-482b-b76a-90b1172083b1-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.125542 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81cae302-997a-482b-b76a-90b1172083b1-kube-api-access-tdkww" (OuterVolumeSpecName: "kube-api-access-tdkww") pod "81cae302-997a-482b-b76a-90b1172083b1" (UID: "81cae302-997a-482b-b76a-90b1172083b1"). InnerVolumeSpecName "kube-api-access-tdkww". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.132060 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "81cae302-997a-482b-b76a-90b1172083b1" (UID: "81cae302-997a-482b-b76a-90b1172083b1"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.223547 4819 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.223582 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdkww\" (UniqueName: \"kubernetes.io/projected/81cae302-997a-482b-b76a-90b1172083b1-kube-api-access-tdkww\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.236428 4819 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.325051 4819 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.535392 4819 generic.go:334] "Generic (PLEG): container finished" podID="81cae302-997a-482b-b76a-90b1172083b1" containerID="3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e" exitCode=0 Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.535446 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.535489 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"81cae302-997a-482b-b76a-90b1172083b1","Type":"ContainerDied","Data":"3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e"} Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.535546 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"81cae302-997a-482b-b76a-90b1172083b1","Type":"ContainerDied","Data":"59bdc841d38f540743d985644b78431bf67270a26d2e9239d7053f095d41e60f"} Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.535573 4819 scope.go:117] "RemoveContainer" containerID="3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.538301 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.538294 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystoneee83-account-delete-6nzk7" event={"ID":"d285f14a-7c11-4d3c-8abc-188a4e2573ce","Type":"ContainerDied","Data":"1b16f7d80ed3d27d6b22593ab3eacafc274bc690785ca8f587d9fd3bcbe2883e"} Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.545722 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-92zwp" event={"ID":"967804ee-64cc-4594-900d-be115f006e13","Type":"ContainerDied","Data":"6265ad18864e48e6b8bbad22fc11603ecd8b4c2d24ce7bd7ea8e07da1b920fb3"} Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.545832 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-92zwp"
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.562125 4819 scope.go:117] "RemoveContainer" containerID="6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d"
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.584465 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"]
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.594911 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"]
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.601360 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"]
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.603238 4819 scope.go:117] "RemoveContainer" containerID="3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e"
Feb 28 03:54:51 crc kubenswrapper[4819]: E0228 03:54:51.603852 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e\": container with ID starting with 3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e not found: ID does not exist" containerID="3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e"
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.603903 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e"} err="failed to get container status \"3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e\": rpc error: code = NotFound desc = could not find container \"3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e\": container with ID starting with 3be6fe1eddb06e92c838323a1365e512654632f29026c5ded30167497cb75c2e not found: ID does not exist"
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.603939 4819 scope.go:117] "RemoveContainer" containerID="6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d"
Feb 28 03:54:51 crc kubenswrapper[4819]: E0228 03:54:51.604494 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d\": container with ID starting with 6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d not found: ID does not exist" containerID="6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d"
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.604540 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d"} err="failed to get container status \"6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d\": rpc error: code = NotFound desc = could not find container \"6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d\": container with ID starting with 6782046ebb5c2945604ce3abc0b8cc4c781246d7d9d04a0f3f289e71a2729e7d not found: ID does not exist"
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.604566 4819 scope.go:117] "RemoveContainer" containerID="e5baa0b7f903de16b237641506c9f6d76d37f67bb89fd97b4e62c89ede7cf5da"
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.624304 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystoneee83-account-delete-6nzk7"]
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.639806 4819 scope.go:117] "RemoveContainer" containerID="9e4ad6facbc1340ef90d01f1148163fc8e33c7f10755df06a4cad94a2218a18a"
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.639874 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-92zwp"]
Feb 28 03:54:51 crc kubenswrapper[4819]: I0228 03:54:51.650916 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-92zwp"]
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.380809 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81cae302-997a-482b-b76a-90b1172083b1" path="/var/lib/kubelet/pods/81cae302-997a-482b-b76a-90b1172083b1/volumes"
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.382148 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c" path="/var/lib/kubelet/pods/8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c/volumes"
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.383106 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="967804ee-64cc-4594-900d-be115f006e13" path="/var/lib/kubelet/pods/967804ee-64cc-4594-900d-be115f006e13/volumes"
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.385489 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18d0ed2-f5db-4f32-b635-7956eeee1f01" path="/var/lib/kubelet/pods/a18d0ed2-f5db-4f32-b635-7956eeee1f01/volumes"
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.386964 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d285f14a-7c11-4d3c-8abc-188a4e2573ce" path="/var/lib/kubelet/pods/d285f14a-7c11-4d3c-8abc-188a4e2573ce/volumes"
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.441635 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5"]
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.441880 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5" podUID="58dd9e5c-5ce0-4cef-b287-413997f8aa49" containerName="operator" containerID="cri-o://8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918" gracePeriod=10
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.764240 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-whnjt"]
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.764718 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt" podUID="96706889-ea67-4e43-ab82-6a6026090647" containerName="registry-server" containerID="cri-o://40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378" gracePeriod=30
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.788346 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"]
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.793266 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590z667x"]
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.875652 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5"
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.958012 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9ssk\" (UniqueName: \"kubernetes.io/projected/58dd9e5c-5ce0-4cef-b287-413997f8aa49-kube-api-access-z9ssk\") pod \"58dd9e5c-5ce0-4cef-b287-413997f8aa49\" (UID: \"58dd9e5c-5ce0-4cef-b287-413997f8aa49\") "
Feb 28 03:54:52 crc kubenswrapper[4819]: I0228 03:54:52.962885 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58dd9e5c-5ce0-4cef-b287-413997f8aa49-kube-api-access-z9ssk" (OuterVolumeSpecName: "kube-api-access-z9ssk") pod "58dd9e5c-5ce0-4cef-b287-413997f8aa49" (UID: "58dd9e5c-5ce0-4cef-b287-413997f8aa49"). InnerVolumeSpecName "kube-api-access-z9ssk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.059422 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9ssk\" (UniqueName: \"kubernetes.io/projected/58dd9e5c-5ce0-4cef-b287-413997f8aa49-kube-api-access-z9ssk\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.156524 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt"
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.261162 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7slbz\" (UniqueName: \"kubernetes.io/projected/96706889-ea67-4e43-ab82-6a6026090647-kube-api-access-7slbz\") pod \"96706889-ea67-4e43-ab82-6a6026090647\" (UID: \"96706889-ea67-4e43-ab82-6a6026090647\") "
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.264202 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96706889-ea67-4e43-ab82-6a6026090647-kube-api-access-7slbz" (OuterVolumeSpecName: "kube-api-access-7slbz") pod "96706889-ea67-4e43-ab82-6a6026090647" (UID: "96706889-ea67-4e43-ab82-6a6026090647"). InnerVolumeSpecName "kube-api-access-7slbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.363384 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7slbz\" (UniqueName: \"kubernetes.io/projected/96706889-ea67-4e43-ab82-6a6026090647-kube-api-access-7slbz\") on node \"crc\" DevicePath \"\""
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.570311 4819 generic.go:334] "Generic (PLEG): container finished" podID="96706889-ea67-4e43-ab82-6a6026090647" containerID="40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378" exitCode=0
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.570396 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt"
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.570478 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt" event={"ID":"96706889-ea67-4e43-ab82-6a6026090647","Type":"ContainerDied","Data":"40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378"}
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.570582 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-whnjt" event={"ID":"96706889-ea67-4e43-ab82-6a6026090647","Type":"ContainerDied","Data":"2d0d2b746567876aabdecd7c89f7e1e03a4831d8f6aad332f0604617fe0163d1"}
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.570617 4819 scope.go:117] "RemoveContainer" containerID="40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378"
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.573731 4819 generic.go:334] "Generic (PLEG): container finished" podID="58dd9e5c-5ce0-4cef-b287-413997f8aa49" containerID="8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918" exitCode=0
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.573827 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5" event={"ID":"58dd9e5c-5ce0-4cef-b287-413997f8aa49","Type":"ContainerDied","Data":"8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918"}
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.573999 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5" event={"ID":"58dd9e5c-5ce0-4cef-b287-413997f8aa49","Type":"ContainerDied","Data":"28b21015957886a85e3600963c2e22becd88991b7598157d9ac33064dc26598e"}
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.574114 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5"
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.606267 4819 scope.go:117] "RemoveContainer" containerID="40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378"
Feb 28 03:54:53 crc kubenswrapper[4819]: E0228 03:54:53.607038 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378\": container with ID starting with 40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378 not found: ID does not exist" containerID="40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378"
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.607102 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378"} err="failed to get container status \"40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378\": rpc error: code = NotFound desc = could not find container \"40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378\": container with ID starting with 40e7469da43c9726907d8d5cdba7135a7aa40802eada55fc0df56831b6cc8378 not found: ID does not exist"
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.607142 4819 scope.go:117] "RemoveContainer" containerID="8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918"
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.625351 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-whnjt"]
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.633055 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-whnjt"]
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.654110 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5"]
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.660422 4819 scope.go:117] "RemoveContainer" containerID="8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918"
Feb 28 03:54:53 crc kubenswrapper[4819]: E0228 03:54:53.661099 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918\": container with ID starting with 8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918 not found: ID does not exist" containerID="8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918"
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.661153 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918"} err="failed to get container status \"8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918\": rpc error: code = NotFound desc = could not find container \"8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918\": container with ID starting with 8c5e93b816467768a34367d72874489cf1a63faf240e3a6183e3ab5f8c855918 not found: ID does not exist"
Feb 28 03:54:53 crc kubenswrapper[4819]: I0228 03:54:53.663499 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-gzvt5"]
Feb 28 03:54:54 crc kubenswrapper[4819]: I0228 03:54:54.379951 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58dd9e5c-5ce0-4cef-b287-413997f8aa49" path="/var/lib/kubelet/pods/58dd9e5c-5ce0-4cef-b287-413997f8aa49/volumes"
Feb 28 03:54:54 crc kubenswrapper[4819]: I0228 03:54:54.381031 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a44caf-c5d8-4b98-ad4f-16833053b82b" path="/var/lib/kubelet/pods/65a44caf-c5d8-4b98-ad4f-16833053b82b/volumes"
Feb 28 03:54:54 crc kubenswrapper[4819]: I0228 03:54:54.382151 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96706889-ea67-4e43-ab82-6a6026090647" path="/var/lib/kubelet/pods/96706889-ea67-4e43-ab82-6a6026090647/volumes"
Feb 28 03:55:02 crc kubenswrapper[4819]: I0228 03:55:02.466402 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"]
Feb 28 03:55:02 crc kubenswrapper[4819]: I0228 03:55:02.467146 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv" podUID="fd2cbc5c-bdc3-411a-b2b3-e195ed65737e" containerName="manager" containerID="cri-o://0bb9b650991cd37cb1b8f87b559e76cbd1ec40205e835fb260cf2999742c3502" gracePeriod=10
Feb 28 03:55:02 crc kubenswrapper[4819]: I0228 03:55:02.649534 4819 generic.go:334] "Generic (PLEG): container finished" podID="fd2cbc5c-bdc3-411a-b2b3-e195ed65737e" containerID="0bb9b650991cd37cb1b8f87b559e76cbd1ec40205e835fb260cf2999742c3502" exitCode=0
Feb 28 03:55:02 crc kubenswrapper[4819]: I0228 03:55:02.649616 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv" event={"ID":"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e","Type":"ContainerDied","Data":"0bb9b650991cd37cb1b8f87b559e76cbd1ec40205e835fb260cf2999742c3502"}
Feb 28 03:55:02 crc kubenswrapper[4819]: I0228 03:55:02.743484 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-wpbg5"]
Feb 28 03:55:02 crc kubenswrapper[4819]: I0228 03:55:02.743724 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-wpbg5" podUID="1d272d93-35f0-4ece-b028-e72a1b0d7b6b" containerName="registry-server" containerID="cri-o://d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca" gracePeriod=30
Feb 28 03:55:02 crc kubenswrapper[4819]: I0228 03:55:02.778931 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48"]
Feb 28 03:55:02 crc kubenswrapper[4819]: I0228 03:55:02.783407 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/79bc65e7ec23ff9ffb8d038f6fd78b8e209a88fbc0eb590e06498779037zr48"]
Feb 28 03:55:02 crc kubenswrapper[4819]: I0228 03:55:02.881236 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.000272 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-apiservice-cert\") pod \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") "
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.000328 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpktt\" (UniqueName: \"kubernetes.io/projected/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-kube-api-access-zpktt\") pod \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") "
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.000351 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-webhook-cert\") pod \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\" (UID: \"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e\") "
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.008310 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "fd2cbc5c-bdc3-411a-b2b3-e195ed65737e" (UID: "fd2cbc5c-bdc3-411a-b2b3-e195ed65737e"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.008497 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "fd2cbc5c-bdc3-411a-b2b3-e195ed65737e" (UID: "fd2cbc5c-bdc3-411a-b2b3-e195ed65737e"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.009067 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-kube-api-access-zpktt" (OuterVolumeSpecName: "kube-api-access-zpktt") pod "fd2cbc5c-bdc3-411a-b2b3-e195ed65737e" (UID: "fd2cbc5c-bdc3-411a-b2b3-e195ed65737e"). InnerVolumeSpecName "kube-api-access-zpktt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.101369 4819 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-apiservice-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.101396 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpktt\" (UniqueName: \"kubernetes.io/projected/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-kube-api-access-zpktt\") on node \"crc\" DevicePath \"\""
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.101407 4819 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e-webhook-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.101758 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-wpbg5"
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.202093 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6swdx\" (UniqueName: \"kubernetes.io/projected/1d272d93-35f0-4ece-b028-e72a1b0d7b6b-kube-api-access-6swdx\") pod \"1d272d93-35f0-4ece-b028-e72a1b0d7b6b\" (UID: \"1d272d93-35f0-4ece-b028-e72a1b0d7b6b\") "
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.206594 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d272d93-35f0-4ece-b028-e72a1b0d7b6b-kube-api-access-6swdx" (OuterVolumeSpecName: "kube-api-access-6swdx") pod "1d272d93-35f0-4ece-b028-e72a1b0d7b6b" (UID: "1d272d93-35f0-4ece-b028-e72a1b0d7b6b"). InnerVolumeSpecName "kube-api-access-6swdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.303648 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6swdx\" (UniqueName: \"kubernetes.io/projected/1d272d93-35f0-4ece-b028-e72a1b0d7b6b-kube-api-access-6swdx\") on node \"crc\" DevicePath \"\""
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.658517 4819 generic.go:334] "Generic (PLEG): container finished" podID="1d272d93-35f0-4ece-b028-e72a1b0d7b6b" containerID="d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca" exitCode=0
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.658555 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-wpbg5"
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.658577 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-wpbg5" event={"ID":"1d272d93-35f0-4ece-b028-e72a1b0d7b6b","Type":"ContainerDied","Data":"d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca"}
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.658603 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-wpbg5" event={"ID":"1d272d93-35f0-4ece-b028-e72a1b0d7b6b","Type":"ContainerDied","Data":"4b00cd4166bd676615c661715e40b3c478ea3d7fe88459d49d88cade5d266e0c"}
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.658646 4819 scope.go:117] "RemoveContainer" containerID="d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca"
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.661104 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.661020 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65c67cfc-55thv" event={"ID":"fd2cbc5c-bdc3-411a-b2b3-e195ed65737e","Type":"ContainerDied","Data":"df9658a1ba77981e1292a68491eb5e6a1e9e360991f200dbb8d3ba0ec05dd6ae"}
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.686917 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-wpbg5"]
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.687801 4819 scope.go:117] "RemoveContainer" containerID="d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca"
Feb 28 03:55:03 crc kubenswrapper[4819]: E0228 03:55:03.688297 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca\": container with ID starting with d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca not found: ID does not exist" containerID="d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca"
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.688332 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca"} err="failed to get container status \"d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca\": rpc error: code = NotFound desc = could not find container \"d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca\": container with ID starting with d00f6a59033f8b37791a8b2a6ba0d653cb57d62dc4e49ee4648d03bda15809ca not found: ID does not exist"
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.688370 4819 scope.go:117] "RemoveContainer" containerID="0bb9b650991cd37cb1b8f87b559e76cbd1ec40205e835fb260cf2999742c3502"
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.691208 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-wpbg5"]
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.702085 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"]
Feb 28 03:55:03 crc kubenswrapper[4819]: I0228 03:55:03.709016 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65c67cfc-55thv"]
Feb 28 03:55:04 crc kubenswrapper[4819]: I0228 03:55:04.397425 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d272d93-35f0-4ece-b028-e72a1b0d7b6b" path="/var/lib/kubelet/pods/1d272d93-35f0-4ece-b028-e72a1b0d7b6b/volumes"
Feb 28 03:55:04 crc kubenswrapper[4819]: I0228 03:55:04.398866 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f3677a-e092-464a-925c-c0792c7590da" path="/var/lib/kubelet/pods/e1f3677a-e092-464a-925c-c0792c7590da/volumes"
Feb 28 03:55:04 crc kubenswrapper[4819]: I0228 03:55:04.400653 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd2cbc5c-bdc3-411a-b2b3-e195ed65737e" path="/var/lib/kubelet/pods/fd2cbc5c-bdc3-411a-b2b3-e195ed65737e/volumes"
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.378493 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"]
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.378743 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf" podUID="1c1220fd-a97c-40ec-8c9b-85fa123a3c61" containerName="manager" containerID="cri-o://0700b20e8409cfdecd94e9ae35aa169a5b97fb34d3f8bf4446257dbcbad3faa9" gracePeriod=10
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.590813 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-vlw7b"]
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.591427 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-vlw7b" podUID="790eff89-c9e4-47ac-8b81-c7276fbdb085" containerName="registry-server" containerID="cri-o://6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5" gracePeriod=30
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.622349 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"]
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.626018 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/210ba99a0c584e776560c767af713a94c183f0b698e37283cc7b4793dcfcx2f"]
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.683768 4819 generic.go:334] "Generic (PLEG): container finished" podID="1c1220fd-a97c-40ec-8c9b-85fa123a3c61" containerID="0700b20e8409cfdecd94e9ae35aa169a5b97fb34d3f8bf4446257dbcbad3faa9" exitCode=0
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.683815 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf" event={"ID":"1c1220fd-a97c-40ec-8c9b-85fa123a3c61","Type":"ContainerDied","Data":"0700b20e8409cfdecd94e9ae35aa169a5b97fb34d3f8bf4446257dbcbad3faa9"}
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.789724 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.839198 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvchz\" (UniqueName: \"kubernetes.io/projected/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-kube-api-access-lvchz\") pod \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") "
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.839265 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-webhook-cert\") pod \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") "
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.839287 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-apiservice-cert\") pod \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\" (UID: \"1c1220fd-a97c-40ec-8c9b-85fa123a3c61\") "
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.845033 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-kube-api-access-lvchz" (OuterVolumeSpecName: "kube-api-access-lvchz") pod "1c1220fd-a97c-40ec-8c9b-85fa123a3c61" (UID: "1c1220fd-a97c-40ec-8c9b-85fa123a3c61"). InnerVolumeSpecName "kube-api-access-lvchz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.848401 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "1c1220fd-a97c-40ec-8c9b-85fa123a3c61" (UID: "1c1220fd-a97c-40ec-8c9b-85fa123a3c61"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.859438 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "1c1220fd-a97c-40ec-8c9b-85fa123a3c61" (UID: "1c1220fd-a97c-40ec-8c9b-85fa123a3c61"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.939407 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-vlw7b"
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.940308 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvchz\" (UniqueName: \"kubernetes.io/projected/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-kube-api-access-lvchz\") on node \"crc\" DevicePath \"\""
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.940326 4819 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-webhook-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:55:05 crc kubenswrapper[4819]: I0228 03:55:05.940336 4819 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c1220fd-a97c-40ec-8c9b-85fa123a3c61-apiservice-cert\") on node \"crc\" DevicePath \"\""
Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.041658 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzvnl\" (UniqueName: \"kubernetes.io/projected/790eff89-c9e4-47ac-8b81-c7276fbdb085-kube-api-access-vzvnl\") pod \"790eff89-c9e4-47ac-8b81-c7276fbdb085\" (UID: \"790eff89-c9e4-47ac-8b81-c7276fbdb085\") "
Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.046312 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790eff89-c9e4-47ac-8b81-c7276fbdb085-kube-api-access-vzvnl" (OuterVolumeSpecName: "kube-api-access-vzvnl") pod "790eff89-c9e4-47ac-8b81-c7276fbdb085" (UID: "790eff89-c9e4-47ac-8b81-c7276fbdb085"). InnerVolumeSpecName "kube-api-access-vzvnl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.143995 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzvnl\" (UniqueName: \"kubernetes.io/projected/790eff89-c9e4-47ac-8b81-c7276fbdb085-kube-api-access-vzvnl\") on node \"crc\" DevicePath \"\""
Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.376799 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ae2373-5f2b-4f8e-8fc7-5d440483017d" path="/var/lib/kubelet/pods/f0ae2373-5f2b-4f8e-8fc7-5d440483017d/volumes"
Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.697232 4819 generic.go:334] "Generic (PLEG): container finished" podID="790eff89-c9e4-47ac-8b81-c7276fbdb085" containerID="6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5" exitCode=0
Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.697485 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-vlw7b"
Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.698440 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-vlw7b" event={"ID":"790eff89-c9e4-47ac-8b81-c7276fbdb085","Type":"ContainerDied","Data":"6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5"}
Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.698789 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-vlw7b" event={"ID":"790eff89-c9e4-47ac-8b81-c7276fbdb085","Type":"ContainerDied","Data":"74506db97c1c15ccc13dce31c74b3e30074932e3f9a270260fb6b42880b3e80b"}
Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.698807 4819 scope.go:117] "RemoveContainer" containerID="6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5"
Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.701549 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf" event={"ID":"1c1220fd-a97c-40ec-8c9b-85fa123a3c61","Type":"ContainerDied","Data":"9cba7400e2e300df308c0b87f775c1e9583eb30c42803ebfaca5b2f4b5c0ce7f"}
Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.701759 4819 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf" Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.737834 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"] Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.746917 4819 scope.go:117] "RemoveContainer" containerID="6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5" Feb 28 03:55:06 crc kubenswrapper[4819]: E0228 03:55:06.747566 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5\": container with ID starting with 6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5 not found: ID does not exist" containerID="6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5" Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.747731 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5"} err="failed to get container status \"6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5\": rpc error: code = NotFound desc = could not find container \"6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5\": container with ID starting with 6a7309400295035aebd24a1ddb54962837a772813dee8774c6f1298adc8811b5 not found: ID does not exist" Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.747891 4819 scope.go:117] "RemoveContainer" containerID="0700b20e8409cfdecd94e9ae35aa169a5b97fb34d3f8bf4446257dbcbad3faa9" Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.750158 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-69797db6c9-792tf"] Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.757868 4819 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-vlw7b"] Feb 28 03:55:06 crc kubenswrapper[4819]: I0228 03:55:06.761705 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-vlw7b"] Feb 28 03:55:08 crc kubenswrapper[4819]: I0228 03:55:08.396350 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c1220fd-a97c-40ec-8c9b-85fa123a3c61" path="/var/lib/kubelet/pods/1c1220fd-a97c-40ec-8c9b-85fa123a3c61/volumes" Feb 28 03:55:08 crc kubenswrapper[4819]: I0228 03:55:08.396933 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790eff89-c9e4-47ac-8b81-c7276fbdb085" path="/var/lib/kubelet/pods/790eff89-c9e4-47ac-8b81-c7276fbdb085/volumes" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.665952 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7hsmw/must-gather-wtqph"] Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.666923 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd2cbc5c-bdc3-411a-b2b3-e195ed65737e" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.666940 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2cbc5c-bdc3-411a-b2b3-e195ed65737e" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.666950 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8575a62-7205-495b-80ed-2c715e87cc72" containerName="setup-container" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.666958 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8575a62-7205-495b-80ed-2c715e87cc72" containerName="setup-container" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.666971 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b16747d-1b6b-44ee-896e-0ead9587deeb" containerName="mysql-bootstrap" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.666980 4819 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9b16747d-1b6b-44ee-896e-0ead9587deeb" containerName="mysql-bootstrap" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.666993 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49fd157-5376-4da3-8d0d-11a9218ce42b" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.666999 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49fd157-5376-4da3-8d0d-11a9218ce42b" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667009 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d272d93-35f0-4ece-b028-e72a1b0d7b6b" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667019 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d272d93-35f0-4ece-b028-e72a1b0d7b6b" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667028 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d285f14a-7c11-4d3c-8abc-188a4e2573ce" containerName="mariadb-account-delete" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667035 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="d285f14a-7c11-4d3c-8abc-188a4e2573ce" containerName="mariadb-account-delete" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667043 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1220fd-a97c-40ec-8c9b-85fa123a3c61" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667050 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1220fd-a97c-40ec-8c9b-85fa123a3c61" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667060 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790eff89-c9e4-47ac-8b81-c7276fbdb085" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667067 4819 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="790eff89-c9e4-47ac-8b81-c7276fbdb085" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667078 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96706889-ea67-4e43-ab82-6a6026090647" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667084 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="96706889-ea67-4e43-ab82-6a6026090647" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667095 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c684da03-6893-45f7-833c-2e71ad6c7e47" containerName="memcached" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667101 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="c684da03-6893-45f7-833c-2e71ad6c7e47" containerName="memcached" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667112 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81cae302-997a-482b-b76a-90b1172083b1" containerName="galera" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667119 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="81cae302-997a-482b-b76a-90b1172083b1" containerName="galera" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667128 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58dd9e5c-5ce0-4cef-b287-413997f8aa49" containerName="operator" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667135 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="58dd9e5c-5ce0-4cef-b287-413997f8aa49" containerName="operator" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667143 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18d0ed2-f5db-4f32-b635-7956eeee1f01" containerName="galera" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667152 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18d0ed2-f5db-4f32-b635-7956eeee1f01" 
containerName="galera" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667166 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="967804ee-64cc-4594-900d-be115f006e13" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667174 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="967804ee-64cc-4594-900d-be115f006e13" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667187 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8575a62-7205-495b-80ed-2c715e87cc72" containerName="rabbitmq" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667195 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8575a62-7205-495b-80ed-2c715e87cc72" containerName="rabbitmq" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667203 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82ee77d5-3f8c-42b9-8025-6c6c73fa17fc" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667210 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ee77d5-3f8c-42b9-8025-6c6c73fa17fc" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667221 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667228 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667240 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18d0ed2-f5db-4f32-b635-7956eeee1f01" containerName="mysql-bootstrap" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667267 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18d0ed2-f5db-4f32-b635-7956eeee1f01" containerName="mysql-bootstrap" Feb 28 03:55:17 crc 
kubenswrapper[4819]: E0228 03:55:17.667281 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b16747d-1b6b-44ee-896e-0ead9587deeb" containerName="galera" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667288 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b16747d-1b6b-44ee-896e-0ead9587deeb" containerName="galera" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667298 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81cae302-997a-482b-b76a-90b1172083b1" containerName="mysql-bootstrap" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667304 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="81cae302-997a-482b-b76a-90b1172083b1" containerName="mysql-bootstrap" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667314 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66de22e-9900-45fe-b074-455829b4084a" containerName="keystone-api" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667320 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66de22e-9900-45fe-b074-455829b4084a" containerName="keystone-api" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667442 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66de22e-9900-45fe-b074-455829b4084a" containerName="keystone-api" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667456 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="82ee77d5-3f8c-42b9-8025-6c6c73fa17fc" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667464 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1220fd-a97c-40ec-8c9b-85fa123a3c61" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667475 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="967804ee-64cc-4594-900d-be115f006e13" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 
03:55:17.667484 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="d285f14a-7c11-4d3c-8abc-188a4e2573ce" containerName="mariadb-account-delete" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667492 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="790eff89-c9e4-47ac-8b81-c7276fbdb085" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667499 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="96706889-ea67-4e43-ab82-6a6026090647" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667507 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8575a62-7205-495b-80ed-2c715e87cc72" containerName="rabbitmq" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667515 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="81cae302-997a-482b-b76a-90b1172083b1" containerName="galera" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667522 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd2cbc5c-bdc3-411a-b2b3-e195ed65737e" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667531 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b16747d-1b6b-44ee-896e-0ead9587deeb" containerName="galera" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667541 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="a18d0ed2-f5db-4f32-b635-7956eeee1f01" containerName="galera" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667551 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="b49fd157-5376-4da3-8d0d-11a9218ce42b" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667562 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eec5bc4-ad59-462f-9fb4-5811b0ad9e5c" containerName="manager" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667577 4819 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1d272d93-35f0-4ece-b028-e72a1b0d7b6b" containerName="registry-server" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667588 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="c684da03-6893-45f7-833c-2e71ad6c7e47" containerName="memcached" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667600 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="d285f14a-7c11-4d3c-8abc-188a4e2573ce" containerName="mariadb-account-delete" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667612 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="58dd9e5c-5ce0-4cef-b287-413997f8aa49" containerName="operator" Feb 28 03:55:17 crc kubenswrapper[4819]: E0228 03:55:17.667731 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d285f14a-7c11-4d3c-8abc-188a4e2573ce" containerName="mariadb-account-delete" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.667740 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="d285f14a-7c11-4d3c-8abc-188a4e2573ce" containerName="mariadb-account-delete" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.668341 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7hsmw/must-gather-wtqph" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.670574 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7hsmw"/"kube-root-ca.crt" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.671783 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7hsmw"/"openshift-service-ca.crt" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.686240 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7hsmw/must-gather-wtqph"] Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.722467 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cccb0a9-d379-4387-af64-270369275bc6-must-gather-output\") pod \"must-gather-wtqph\" (UID: \"6cccb0a9-d379-4387-af64-270369275bc6\") " pod="openshift-must-gather-7hsmw/must-gather-wtqph" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.722791 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-826mn\" (UniqueName: \"kubernetes.io/projected/6cccb0a9-d379-4387-af64-270369275bc6-kube-api-access-826mn\") pod \"must-gather-wtqph\" (UID: \"6cccb0a9-d379-4387-af64-270369275bc6\") " pod="openshift-must-gather-7hsmw/must-gather-wtqph" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.823944 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cccb0a9-d379-4387-af64-270369275bc6-must-gather-output\") pod \"must-gather-wtqph\" (UID: \"6cccb0a9-d379-4387-af64-270369275bc6\") " pod="openshift-must-gather-7hsmw/must-gather-wtqph" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.824078 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-826mn\" (UniqueName: \"kubernetes.io/projected/6cccb0a9-d379-4387-af64-270369275bc6-kube-api-access-826mn\") pod \"must-gather-wtqph\" (UID: \"6cccb0a9-d379-4387-af64-270369275bc6\") " pod="openshift-must-gather-7hsmw/must-gather-wtqph" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.824460 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cccb0a9-d379-4387-af64-270369275bc6-must-gather-output\") pod \"must-gather-wtqph\" (UID: \"6cccb0a9-d379-4387-af64-270369275bc6\") " pod="openshift-must-gather-7hsmw/must-gather-wtqph" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.841393 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-826mn\" (UniqueName: \"kubernetes.io/projected/6cccb0a9-d379-4387-af64-270369275bc6-kube-api-access-826mn\") pod \"must-gather-wtqph\" (UID: \"6cccb0a9-d379-4387-af64-270369275bc6\") " pod="openshift-must-gather-7hsmw/must-gather-wtqph" Feb 28 03:55:17 crc kubenswrapper[4819]: I0228 03:55:17.982841 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7hsmw/must-gather-wtqph" Feb 28 03:55:18 crc kubenswrapper[4819]: I0228 03:55:18.462214 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7hsmw/must-gather-wtqph"] Feb 28 03:55:18 crc kubenswrapper[4819]: I0228 03:55:18.474510 4819 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 03:55:18 crc kubenswrapper[4819]: I0228 03:55:18.763454 4819 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/keystone-bf7f56bd7-gdd5v" podUID="c66de22e-9900-45fe-b074-455829b4084a" containerName="keystone-api" probeResult="failure" output="Get \"http://10.217.0.87:5000/v3\": dial tcp 10.217.0.87:5000: i/o timeout (Client.Timeout exceeded while awaiting headers)" Feb 28 03:55:18 crc kubenswrapper[4819]: I0228 03:55:18.805762 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hsmw/must-gather-wtqph" event={"ID":"6cccb0a9-d379-4387-af64-270369275bc6","Type":"ContainerStarted","Data":"fce1cc05423191fe26601c59ea174365b8ae77ccea50caf9f3787cd7b6a7d683"} Feb 28 03:55:25 crc kubenswrapper[4819]: I0228 03:55:25.866550 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hsmw/must-gather-wtqph" event={"ID":"6cccb0a9-d379-4387-af64-270369275bc6","Type":"ContainerStarted","Data":"d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53"} Feb 28 03:55:25 crc kubenswrapper[4819]: I0228 03:55:25.867082 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hsmw/must-gather-wtqph" event={"ID":"6cccb0a9-d379-4387-af64-270369275bc6","Type":"ContainerStarted","Data":"cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2"} Feb 28 03:55:25 crc kubenswrapper[4819]: I0228 03:55:25.889201 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7hsmw/must-gather-wtqph" podStartSLOduration=2.696718696 
podStartE2EDuration="8.889181229s" podCreationTimestamp="2026-02-28 03:55:17 +0000 UTC" firstStartedPulling="2026-02-28 03:55:18.474442566 +0000 UTC m=+1256.940011464" lastFinishedPulling="2026-02-28 03:55:24.666905129 +0000 UTC m=+1263.132473997" observedRunningTime="2026-02-28 03:55:25.886163836 +0000 UTC m=+1264.351732734" watchObservedRunningTime="2026-02-28 03:55:25.889181229 +0000 UTC m=+1264.354750117" Feb 28 03:55:50 crc kubenswrapper[4819]: I0228 03:55:50.571599 4819 scope.go:117] "RemoveContainer" containerID="77b34b3263acbaee8b2d070e595c53bed9567865efe5f9db59c0974776600bc9" Feb 28 03:55:50 crc kubenswrapper[4819]: I0228 03:55:50.607165 4819 scope.go:117] "RemoveContainer" containerID="0ef72527af940da10e8c69b1cc4add2231f6627f2cc030c5dbcafb50c003d46b" Feb 28 03:55:50 crc kubenswrapper[4819]: I0228 03:55:50.625139 4819 scope.go:117] "RemoveContainer" containerID="4898dfae76fcdf0b68cdc60b454d69c64c8090aea908fb497271284d4811cd23" Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.151125 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537516-rbhzb"] Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.158341 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537516-rbhzb" Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.166779 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537516-rbhzb"] Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.197522 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.197620 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw" Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.197678 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.231598 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knh5g\" (UniqueName: \"kubernetes.io/projected/303e7aea-596a-4429-a4d1-e88b6737bd9b-kube-api-access-knh5g\") pod \"auto-csr-approver-29537516-rbhzb\" (UID: \"303e7aea-596a-4429-a4d1-e88b6737bd9b\") " pod="openshift-infra/auto-csr-approver-29537516-rbhzb" Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.332836 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knh5g\" (UniqueName: \"kubernetes.io/projected/303e7aea-596a-4429-a4d1-e88b6737bd9b-kube-api-access-knh5g\") pod \"auto-csr-approver-29537516-rbhzb\" (UID: \"303e7aea-596a-4429-a4d1-e88b6737bd9b\") " pod="openshift-infra/auto-csr-approver-29537516-rbhzb" Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.366761 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knh5g\" (UniqueName: \"kubernetes.io/projected/303e7aea-596a-4429-a4d1-e88b6737bd9b-kube-api-access-knh5g\") pod \"auto-csr-approver-29537516-rbhzb\" (UID: \"303e7aea-596a-4429-a4d1-e88b6737bd9b\") " 
pod="openshift-infra/auto-csr-approver-29537516-rbhzb" Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.508204 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537516-rbhzb" Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.764393 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537516-rbhzb"] Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.834135 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 03:56:00 crc kubenswrapper[4819]: I0228 03:56:00.834213 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 03:56:01 crc kubenswrapper[4819]: I0228 03:56:01.164108 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537516-rbhzb" event={"ID":"303e7aea-596a-4429-a4d1-e88b6737bd9b","Type":"ContainerStarted","Data":"e6246c265744a619ad6fc6acd7545ee4b0cf20e57daef8917382677d0c63ce61"} Feb 28 03:56:03 crc kubenswrapper[4819]: I0228 03:56:03.180845 4819 generic.go:334] "Generic (PLEG): container finished" podID="303e7aea-596a-4429-a4d1-e88b6737bd9b" containerID="2b5db0da62d02cc5c546f3fa90b2354a54c282131e81aced30a9d5dfd7646a76" exitCode=0 Feb 28 03:56:03 crc kubenswrapper[4819]: I0228 03:56:03.180941 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537516-rbhzb" 
event={"ID":"303e7aea-596a-4429-a4d1-e88b6737bd9b","Type":"ContainerDied","Data":"2b5db0da62d02cc5c546f3fa90b2354a54c282131e81aced30a9d5dfd7646a76"} Feb 28 03:56:04 crc kubenswrapper[4819]: I0228 03:56:04.477033 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537516-rbhzb" Feb 28 03:56:04 crc kubenswrapper[4819]: I0228 03:56:04.596148 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knh5g\" (UniqueName: \"kubernetes.io/projected/303e7aea-596a-4429-a4d1-e88b6737bd9b-kube-api-access-knh5g\") pod \"303e7aea-596a-4429-a4d1-e88b6737bd9b\" (UID: \"303e7aea-596a-4429-a4d1-e88b6737bd9b\") " Feb 28 03:56:04 crc kubenswrapper[4819]: I0228 03:56:04.612040 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303e7aea-596a-4429-a4d1-e88b6737bd9b-kube-api-access-knh5g" (OuterVolumeSpecName: "kube-api-access-knh5g") pod "303e7aea-596a-4429-a4d1-e88b6737bd9b" (UID: "303e7aea-596a-4429-a4d1-e88b6737bd9b"). InnerVolumeSpecName "kube-api-access-knh5g". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:56:04 crc kubenswrapper[4819]: I0228 03:56:04.697639 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knh5g\" (UniqueName: \"kubernetes.io/projected/303e7aea-596a-4429-a4d1-e88b6737bd9b-kube-api-access-knh5g\") on node \"crc\" DevicePath \"\""
Feb 28 03:56:05 crc kubenswrapper[4819]: I0228 03:56:05.193942 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537516-rbhzb" event={"ID":"303e7aea-596a-4429-a4d1-e88b6737bd9b","Type":"ContainerDied","Data":"e6246c265744a619ad6fc6acd7545ee4b0cf20e57daef8917382677d0c63ce61"}
Feb 28 03:56:05 crc kubenswrapper[4819]: I0228 03:56:05.193990 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6246c265744a619ad6fc6acd7545ee4b0cf20e57daef8917382677d0c63ce61"
Feb 28 03:56:05 crc kubenswrapper[4819]: I0228 03:56:05.194029 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537516-rbhzb"
Feb 28 03:56:05 crc kubenswrapper[4819]: I0228 03:56:05.534983 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537510-8mv82"]
Feb 28 03:56:05 crc kubenswrapper[4819]: I0228 03:56:05.537721 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537510-8mv82"]
Feb 28 03:56:06 crc kubenswrapper[4819]: I0228 03:56:06.380227 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd5b8da-92d4-4701-8aac-c77e83d76567" path="/var/lib/kubelet/pods/dcd5b8da-92d4-4701-8aac-c77e83d76567/volumes"
Feb 28 03:56:19 crc kubenswrapper[4819]: I0228 03:56:19.743723 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qv8xl_5e568a4f-33f8-447b-840c-dc560774878d/control-plane-machine-set-operator/0.log"
Feb 28 03:56:19 crc kubenswrapper[4819]: I0228 03:56:19.885453 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tdcjf_b6a78496-9606-4297-b022-286a969e9ea6/kube-rbac-proxy/0.log"
Feb 28 03:56:19 crc kubenswrapper[4819]: I0228 03:56:19.907796 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tdcjf_b6a78496-9606-4297-b022-286a969e9ea6/machine-api-operator/0.log"
Feb 28 03:56:30 crc kubenswrapper[4819]: I0228 03:56:30.834894 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 03:56:30 crc kubenswrapper[4819]: I0228 03:56:30.835861 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 03:56:50 crc kubenswrapper[4819]: I0228 03:56:50.504902 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-sgqvl_afd495b6-4c71-455d-ba10-061bbf630cc5/kube-rbac-proxy/0.log"
Feb 28 03:56:50 crc kubenswrapper[4819]: I0228 03:56:50.528473 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-sgqvl_afd495b6-4c71-455d-ba10-061bbf630cc5/controller/0.log"
Feb 28 03:56:50 crc kubenswrapper[4819]: I0228 03:56:50.681664 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-frr-files/0.log"
Feb 28 03:56:50 crc kubenswrapper[4819]: I0228 03:56:50.828629 4819 scope.go:117] "RemoveContainer" containerID="71f4ab953d963d5206b80a3b933886691381ae4197e36557863800ff386245c3"
Feb 28 03:56:50 crc kubenswrapper[4819]: I0228 03:56:50.844276 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-frr-files/0.log"
Feb 28 03:56:50 crc kubenswrapper[4819]: I0228 03:56:50.865339 4819 scope.go:117] "RemoveContainer" containerID="a07edff885000ba80b026778c2a5052473c759caa180df7643b8fc5aedbe0105"
Feb 28 03:56:50 crc kubenswrapper[4819]: I0228 03:56:50.872520 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-reloader/0.log"
Feb 28 03:56:50 crc kubenswrapper[4819]: I0228 03:56:50.877630 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-reloader/0.log"
Feb 28 03:56:50 crc kubenswrapper[4819]: I0228 03:56:50.886268 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-metrics/0.log"
Feb 28 03:56:50 crc kubenswrapper[4819]: I0228 03:56:50.889072 4819 scope.go:117] "RemoveContainer" containerID="aba080e6484c8527072a5e8957dd2736f9d3132d53bf1440cf98659597c79d0f"
Feb 28 03:56:50 crc kubenswrapper[4819]: I0228 03:56:50.912965 4819 scope.go:117] "RemoveContainer" containerID="c0d50663bbe2d666c36a9da32faba86cc97b14ad031d079b7ce21c3c84ad2ea7"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.041621 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-frr-files/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.048081 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-reloader/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.089101 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-metrics/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.091788 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-metrics/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.251517 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/controller/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.253462 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-frr-files/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.258559 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-reloader/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.261172 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-metrics/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.424734 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/kube-rbac-proxy/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.470180 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/kube-rbac-proxy-frr/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.470204 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/frr-metrics/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.591733 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/reloader/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.639202 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-xwnrs_d65903fa-28c3-4b9d-890d-7b605a91f0d8/frr-k8s-webhook-server/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.775859 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/frr/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.875970 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d5484c9c9-4vfr2_4384095a-345c-400a-bd62-0f8ca53b1ea3/manager/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.945066 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-644d79b54d-t68wv_55049d53-0610-4ce3-843d-209930ea1421/webhook-server/0.log"
Feb 28 03:56:51 crc kubenswrapper[4819]: I0228 03:56:51.994896 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zt2hg_3257ff95-9460-49e8-8ae9-9758e419abee/kube-rbac-proxy/0.log"
Feb 28 03:56:52 crc kubenswrapper[4819]: I0228 03:56:52.127428 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zt2hg_3257ff95-9460-49e8-8ae9-9758e419abee/speaker/0.log"
Feb 28 03:57:00 crc kubenswrapper[4819]: I0228 03:57:00.833975 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 03:57:00 crc kubenswrapper[4819]: I0228 03:57:00.834828 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 03:57:00 crc kubenswrapper[4819]: I0228 03:57:00.834894 4819 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn"
Feb 28 03:57:00 crc kubenswrapper[4819]: I0228 03:57:00.835752 4819 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d369a9c21ccdd5b3db603da688e0c28628885c9c52c044661ee7b6146a29101"} pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 28 03:57:00 crc kubenswrapper[4819]: I0228 03:57:00.835916 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" containerID="cri-o://9d369a9c21ccdd5b3db603da688e0c28628885c9c52c044661ee7b6146a29101" gracePeriod=600
Feb 28 03:57:00 crc kubenswrapper[4819]: E0228 03:57:00.973545 4819 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6ad11c1_0eb7_4064_bb39_3ffb389efb90.slice/crio-conmon-9d369a9c21ccdd5b3db603da688e0c28628885c9c52c044661ee7b6146a29101.scope\": RecentStats: unable to find data in memory cache]"
Feb 28 03:57:01 crc kubenswrapper[4819]: I0228 03:57:01.608488 4819 generic.go:334] "Generic (PLEG): container finished" podID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerID="9d369a9c21ccdd5b3db603da688e0c28628885c9c52c044661ee7b6146a29101" exitCode=0
Feb 28 03:57:01 crc kubenswrapper[4819]: I0228 03:57:01.609092 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerDied","Data":"9d369a9c21ccdd5b3db603da688e0c28628885c9c52c044661ee7b6146a29101"}
Feb 28 03:57:01 crc kubenswrapper[4819]: I0228 03:57:01.609130 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerStarted","Data":"9a6196e1947fe2ebe77dd10c343260071400941456b6e5fa2ac8b053fa27f275"}
Feb 28 03:57:01 crc kubenswrapper[4819]: I0228 03:57:01.609158 4819 scope.go:117] "RemoveContainer" containerID="2be5a3de849f4caa81a3f6eb2371d580108119159dd0203e877d29c0441c1708"
Feb 28 03:57:19 crc kubenswrapper[4819]: I0228 03:57:19.766538 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-utilities/0.log"
Feb 28 03:57:19 crc kubenswrapper[4819]: I0228 03:57:19.942709 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-content/0.log"
Feb 28 03:57:19 crc kubenswrapper[4819]: I0228 03:57:19.949656 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-content/0.log"
Feb 28 03:57:19 crc kubenswrapper[4819]: I0228 03:57:19.957588 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-utilities/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.132191 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-utilities/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.132805 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-content/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.299159 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-utilities/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.399106 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/registry-server/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.429630 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-content/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.454938 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-utilities/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.488301 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-content/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.618302 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-utilities/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.647537 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-content/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.831817 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/util/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.897590 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/registry-server/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.964626 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/util/0.log"
Feb 28 03:57:20 crc kubenswrapper[4819]: I0228 03:57:20.968318 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/pull/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.004631 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/pull/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.144540 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/util/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.160653 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/extract/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.173456 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/pull/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.341472 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-utilities/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.350465 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-wnnt5_15512f8d-a53e-47cb-9b22-b8f8f410d65d/marketplace-operator/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.528763 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-utilities/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.533679 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-content/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.553885 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-content/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.664101 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-content/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.720826 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-utilities/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.747155 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/registry-server/0.log"
Feb 28 03:57:21 crc kubenswrapper[4819]: I0228 03:57:21.834344 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-utilities/0.log"
Feb 28 03:57:22 crc kubenswrapper[4819]: I0228 03:57:22.000914 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-utilities/0.log"
Feb 28 03:57:22 crc kubenswrapper[4819]: I0228 03:57:22.043295 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-content/0.log"
Feb 28 03:57:22 crc kubenswrapper[4819]: I0228 03:57:22.045276 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-content/0.log"
Feb 28 03:57:22 crc kubenswrapper[4819]: I0228 03:57:22.178515 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-content/0.log"
Feb 28 03:57:22 crc kubenswrapper[4819]: I0228 03:57:22.234387 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-utilities/0.log"
Feb 28 03:57:22 crc kubenswrapper[4819]: I0228 03:57:22.505349 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/registry-server/0.log"
Feb 28 03:57:50 crc kubenswrapper[4819]: I0228 03:57:50.983580 4819 scope.go:117] "RemoveContainer" containerID="7cb67bd71228f5246ee73d9ec3627ce72963ad2cc2b1308cef1dfeb60b347973"
Feb 28 03:57:51 crc kubenswrapper[4819]: I0228 03:57:51.017477 4819 scope.go:117] "RemoveContainer" containerID="22652fedb265636c3a9ade8a81db4e542658bbf7662ee83c711e12c19a56902c"
Feb 28 03:57:51 crc kubenswrapper[4819]: I0228 03:57:51.055006 4819 scope.go:117] "RemoveContainer" containerID="523d8d766f60d5aa31d4dc3fd74b03f3f3318e46b28e33be6a2ff5a06367f63f"
Feb 28 03:57:51 crc kubenswrapper[4819]: I0228 03:57:51.079806 4819 scope.go:117] "RemoveContainer" containerID="c82b7d129f999d46def29cb424699c9eda5294bd13fd03ca32168648bfc810ad"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.156892 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537518-7v29k"]
Feb 28 03:58:00 crc kubenswrapper[4819]: E0228 03:58:00.157603 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303e7aea-596a-4429-a4d1-e88b6737bd9b" containerName="oc"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.157626 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="303e7aea-596a-4429-a4d1-e88b6737bd9b" containerName="oc"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.157841 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="303e7aea-596a-4429-a4d1-e88b6737bd9b" containerName="oc"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.158430 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537518-7v29k"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.161767 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.162159 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.163409 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.170290 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537518-7v29k"]
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.289224 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrgw2\" (UniqueName: \"kubernetes.io/projected/115b12b9-5377-4f22-b13b-a5aaf42dd570-kube-api-access-mrgw2\") pod \"auto-csr-approver-29537518-7v29k\" (UID: \"115b12b9-5377-4f22-b13b-a5aaf42dd570\") " pod="openshift-infra/auto-csr-approver-29537518-7v29k"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.390791 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrgw2\" (UniqueName: \"kubernetes.io/projected/115b12b9-5377-4f22-b13b-a5aaf42dd570-kube-api-access-mrgw2\") pod \"auto-csr-approver-29537518-7v29k\" (UID: \"115b12b9-5377-4f22-b13b-a5aaf42dd570\") " pod="openshift-infra/auto-csr-approver-29537518-7v29k"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.415349 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrgw2\" (UniqueName: \"kubernetes.io/projected/115b12b9-5377-4f22-b13b-a5aaf42dd570-kube-api-access-mrgw2\") pod \"auto-csr-approver-29537518-7v29k\" (UID: \"115b12b9-5377-4f22-b13b-a5aaf42dd570\") " pod="openshift-infra/auto-csr-approver-29537518-7v29k"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.501731 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537518-7v29k"
Feb 28 03:58:00 crc kubenswrapper[4819]: I0228 03:58:00.779073 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537518-7v29k"]
Feb 28 03:58:00 crc kubenswrapper[4819]: W0228 03:58:00.790920 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod115b12b9_5377_4f22_b13b_a5aaf42dd570.slice/crio-7dae821e77defa32ab3c82c3394742dc3725c18b816eee19e5bf3261e4937b11 WatchSource:0}: Error finding container 7dae821e77defa32ab3c82c3394742dc3725c18b816eee19e5bf3261e4937b11: Status 404 returned error can't find the container with id 7dae821e77defa32ab3c82c3394742dc3725c18b816eee19e5bf3261e4937b11
Feb 28 03:58:01 crc kubenswrapper[4819]: I0228 03:58:01.094989 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537518-7v29k" event={"ID":"115b12b9-5377-4f22-b13b-a5aaf42dd570","Type":"ContainerStarted","Data":"7dae821e77defa32ab3c82c3394742dc3725c18b816eee19e5bf3261e4937b11"}
Feb 28 03:58:03 crc kubenswrapper[4819]: I0228 03:58:03.110637 4819 generic.go:334] "Generic (PLEG): container finished" podID="115b12b9-5377-4f22-b13b-a5aaf42dd570" containerID="5c4a18c64163ab0cf963cbaee6e85eeff2525909ce1699af73c57ec1a9508648" exitCode=0
Feb 28 03:58:03 crc kubenswrapper[4819]: I0228 03:58:03.111341 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537518-7v29k" event={"ID":"115b12b9-5377-4f22-b13b-a5aaf42dd570","Type":"ContainerDied","Data":"5c4a18c64163ab0cf963cbaee6e85eeff2525909ce1699af73c57ec1a9508648"}
Feb 28 03:58:04 crc kubenswrapper[4819]: I0228 03:58:04.432084 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537518-7v29k"
Feb 28 03:58:04 crc kubenswrapper[4819]: I0228 03:58:04.553449 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrgw2\" (UniqueName: \"kubernetes.io/projected/115b12b9-5377-4f22-b13b-a5aaf42dd570-kube-api-access-mrgw2\") pod \"115b12b9-5377-4f22-b13b-a5aaf42dd570\" (UID: \"115b12b9-5377-4f22-b13b-a5aaf42dd570\") "
Feb 28 03:58:04 crc kubenswrapper[4819]: I0228 03:58:04.574994 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/115b12b9-5377-4f22-b13b-a5aaf42dd570-kube-api-access-mrgw2" (OuterVolumeSpecName: "kube-api-access-mrgw2") pod "115b12b9-5377-4f22-b13b-a5aaf42dd570" (UID: "115b12b9-5377-4f22-b13b-a5aaf42dd570"). InnerVolumeSpecName "kube-api-access-mrgw2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:58:04 crc kubenswrapper[4819]: I0228 03:58:04.654694 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrgw2\" (UniqueName: \"kubernetes.io/projected/115b12b9-5377-4f22-b13b-a5aaf42dd570-kube-api-access-mrgw2\") on node \"crc\" DevicePath \"\""
Feb 28 03:58:05 crc kubenswrapper[4819]: I0228 03:58:05.136392 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537518-7v29k" event={"ID":"115b12b9-5377-4f22-b13b-a5aaf42dd570","Type":"ContainerDied","Data":"7dae821e77defa32ab3c82c3394742dc3725c18b816eee19e5bf3261e4937b11"}
Feb 28 03:58:05 crc kubenswrapper[4819]: I0228 03:58:05.137093 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dae821e77defa32ab3c82c3394742dc3725c18b816eee19e5bf3261e4937b11"
Feb 28 03:58:05 crc kubenswrapper[4819]: I0228 03:58:05.136502 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537518-7v29k"
Feb 28 03:58:05 crc kubenswrapper[4819]: I0228 03:58:05.526312 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537512-hwm9s"]
Feb 28 03:58:05 crc kubenswrapper[4819]: I0228 03:58:05.529143 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537512-hwm9s"]
Feb 28 03:58:06 crc kubenswrapper[4819]: I0228 03:58:06.384270 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7" path="/var/lib/kubelet/pods/288a4e97-a8b4-4ba9-b55a-e69f14c6a7c7/volumes"
Feb 28 03:58:31 crc kubenswrapper[4819]: I0228 03:58:31.373506 4819 generic.go:334] "Generic (PLEG): container finished" podID="6cccb0a9-d379-4387-af64-270369275bc6" containerID="cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2" exitCode=0
Feb 28 03:58:31 crc kubenswrapper[4819]: I0228 03:58:31.373624 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hsmw/must-gather-wtqph" event={"ID":"6cccb0a9-d379-4387-af64-270369275bc6","Type":"ContainerDied","Data":"cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2"}
Feb 28 03:58:31 crc kubenswrapper[4819]: I0228 03:58:31.374561 4819 scope.go:117] "RemoveContainer" containerID="cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2"
Feb 28 03:58:32 crc kubenswrapper[4819]: I0228 03:58:32.309563 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7hsmw_must-gather-wtqph_6cccb0a9-d379-4387-af64-270369275bc6/gather/0.log"
Feb 28 03:58:39 crc kubenswrapper[4819]: I0228 03:58:39.599401 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7hsmw/must-gather-wtqph"]
Feb 28 03:58:39 crc kubenswrapper[4819]: I0228 03:58:39.600187 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7hsmw/must-gather-wtqph" podUID="6cccb0a9-d379-4387-af64-270369275bc6" containerName="copy" containerID="cri-o://d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53" gracePeriod=2
Feb 28 03:58:39 crc kubenswrapper[4819]: I0228 03:58:39.607062 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7hsmw/must-gather-wtqph"]
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.017365 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7hsmw_must-gather-wtqph_6cccb0a9-d379-4387-af64-270369275bc6/copy/0.log"
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.018011 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hsmw/must-gather-wtqph"
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.048096 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cccb0a9-d379-4387-af64-270369275bc6-must-gather-output\") pod \"6cccb0a9-d379-4387-af64-270369275bc6\" (UID: \"6cccb0a9-d379-4387-af64-270369275bc6\") "
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.048140 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-826mn\" (UniqueName: \"kubernetes.io/projected/6cccb0a9-d379-4387-af64-270369275bc6-kube-api-access-826mn\") pod \"6cccb0a9-d379-4387-af64-270369275bc6\" (UID: \"6cccb0a9-d379-4387-af64-270369275bc6\") "
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.065946 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cccb0a9-d379-4387-af64-270369275bc6-kube-api-access-826mn" (OuterVolumeSpecName: "kube-api-access-826mn") pod "6cccb0a9-d379-4387-af64-270369275bc6" (UID: "6cccb0a9-d379-4387-af64-270369275bc6"). InnerVolumeSpecName "kube-api-access-826mn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.108009 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cccb0a9-d379-4387-af64-270369275bc6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6cccb0a9-d379-4387-af64-270369275bc6" (UID: "6cccb0a9-d379-4387-af64-270369275bc6"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.149972 4819 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cccb0a9-d379-4387-af64-270369275bc6-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.150021 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-826mn\" (UniqueName: \"kubernetes.io/projected/6cccb0a9-d379-4387-af64-270369275bc6-kube-api-access-826mn\") on node \"crc\" DevicePath \"\""
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.380971 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cccb0a9-d379-4387-af64-270369275bc6" path="/var/lib/kubelet/pods/6cccb0a9-d379-4387-af64-270369275bc6/volumes"
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.442867 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7hsmw_must-gather-wtqph_6cccb0a9-d379-4387-af64-270369275bc6/copy/0.log"
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.443704 4819 generic.go:334] "Generic (PLEG): container finished" podID="6cccb0a9-d379-4387-af64-270369275bc6" containerID="d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53" exitCode=143
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.443761 4819 scope.go:117] "RemoveContainer" containerID="d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53"
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.443908 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hsmw/must-gather-wtqph"
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.485182 4819 scope.go:117] "RemoveContainer" containerID="cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2"
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.546959 4819 scope.go:117] "RemoveContainer" containerID="d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53"
Feb 28 03:58:40 crc kubenswrapper[4819]: E0228 03:58:40.547567 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53\": container with ID starting with d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53 not found: ID does not exist" containerID="d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53"
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.547622 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53"} err="failed to get container status \"d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53\": rpc error: code = NotFound desc = could not find container \"d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53\": container with ID starting with d02e5d80d6882655459a74e2bc9629845850cd6380f59728c1daa7db8c8b9f53 not found: ID does not exist"
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.547653 4819 scope.go:117] "RemoveContainer" containerID="cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2"
Feb 28 03:58:40 crc kubenswrapper[4819]: E0228 03:58:40.548132 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2\": container with ID starting with cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2 not found: ID does not exist" containerID="cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2"
Feb 28 03:58:40 crc kubenswrapper[4819]: I0228 03:58:40.548172 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2"} err="failed to get container status \"cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2\": rpc error: code = NotFound desc = could not find container \"cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2\": container with ID starting with cd4867a2363f4e95b3d52b051f5a676c8e6ef3b16004aa9c6144cba9de46ffc2 not found: ID does not exist"
Feb 28 03:58:51 crc kubenswrapper[4819]: I0228 03:58:51.139940 4819 scope.go:117] "RemoveContainer" containerID="0f483024b42efa5b30db139b11ffd97e6fe5923f275055315398edfe1490b914"
Feb 28 03:58:51 crc kubenswrapper[4819]: I0228 03:58:51.196128 4819 scope.go:117] "RemoveContainer" containerID="495028479d72ccade35bdc2249ad245f58ef913aeae108a2ae175ab0f2e65e03"
Feb 28 03:58:51 crc kubenswrapper[4819]: I0228 03:58:51.227151 4819 scope.go:117] "RemoveContainer" containerID="0c40aeb40f2ea24011ca9b9c8b48dab39bd8f56463c0d5dd1ec35a265f08cf2b"
Feb 28 03:58:51 crc kubenswrapper[4819]: I0228 03:58:51.268724 4819 scope.go:117] "RemoveContainer" containerID="4ab273dfbc4bb8a4746e8c840d380434faf5007db662fe2d274f05bf7c85cd42"
Feb 28 03:58:51 crc kubenswrapper[4819]: I0228 03:58:51.283267 4819 scope.go:117] "RemoveContainer" containerID="771928f415d984c36b7e15323907df7846b3812dfbcfb52e97202445b058d1b9"
Feb 28 03:58:51 crc kubenswrapper[4819]: I0228 03:58:51.307863 4819 scope.go:117] "RemoveContainer" containerID="0ab3c436d4fb3a495be539e08073c5ba842461b6c84aedb5c2e399b55f58d227"
Feb 28 03:59:30 crc kubenswrapper[4819]: I0228 03:59:30.834205 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 28 03:59:30 crc kubenswrapper[4819]: I0228 03:59:30.834975 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 28 03:59:51 crc kubenswrapper[4819]: I0228 03:59:51.390907 4819 scope.go:117] "RemoveContainer" containerID="e946fea03afa6470a8097019c351aa8266932a3e0ece422234188c8f8e4a7bbc"
Feb 28 03:59:51 crc kubenswrapper[4819]: I0228 03:59:51.419511 4819 scope.go:117] "RemoveContainer" containerID="951482b6a05666a3b0d20aaba57776494dceb97af16580d34ac6a390ef29b346"
Feb 28 03:59:51 crc kubenswrapper[4819]: I0228 03:59:51.438866 4819 scope.go:117] "RemoveContainer" containerID="37b4c5862d7467856fa9d6a22198a9df342f05267729e82021b04d502c72f8c5"
Feb 28 03:59:51 crc kubenswrapper[4819]: I0228 03:59:51.459941 4819 scope.go:117] "RemoveContainer" containerID="455c397d636246c1d36a80721a8b9986b3fec5c926f9cb5a5a8b6b20270cd096"
Feb 28 03:59:51 crc kubenswrapper[4819]: I0228 03:59:51.503635 4819 scope.go:117] "RemoveContainer" containerID="24a30d737b8b0ab6e5d840797bc62d550feab4bc0c06489fd4f821811a692eae"
Feb 28 03:59:51 crc kubenswrapper[4819]: I0228 03:59:51.525600 4819 scope.go:117] "RemoveContainer" containerID="389b6726bf5f90963538c33367844b38e39c3836d02eed00c327a21d4aeb46d8"
Feb 28 03:59:51 crc kubenswrapper[4819]: I0228 03:59:51.546477 4819 scope.go:117] "RemoveContainer" containerID="f71cb2629aa563d677a945ce7654c5438c5fa8a6e777d0f318dfda79eaef8269"
Feb 28 03:59:51 crc kubenswrapper[4819]: I0228
03:59:51.562474 4819 scope.go:117] "RemoveContainer" containerID="c7a251fbf38750e5bf0f47cd64dc4b60946989fe9dd894b679233003028060c7" Feb 28 03:59:51 crc kubenswrapper[4819]: I0228 03:59:51.582235 4819 scope.go:117] "RemoveContainer" containerID="3e331c43edcbe05ce304e52e88d7c8531de494cb84650dbf542ea8f661de5b21" Feb 28 03:59:51 crc kubenswrapper[4819]: I0228 03:59:51.600613 4819 scope.go:117] "RemoveContainer" containerID="ca4e90c95af3abe85a0867527182d2e377f6488de6fb00113a1103be905a49d4" Feb 28 03:59:51 crc kubenswrapper[4819]: I0228 03:59:51.619933 4819 scope.go:117] "RemoveContainer" containerID="8a49fc8fafc6a1f43a727f2674b36df16257d18309f28060a024ea0cc865b1ce" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.158005 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537520-bxcj7"] Feb 28 04:00:00 crc kubenswrapper[4819]: E0228 04:00:00.158872 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cccb0a9-d379-4387-af64-270369275bc6" containerName="copy" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.158887 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cccb0a9-d379-4387-af64-270369275bc6" containerName="copy" Feb 28 04:00:00 crc kubenswrapper[4819]: E0228 04:00:00.158898 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="115b12b9-5377-4f22-b13b-a5aaf42dd570" containerName="oc" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.158908 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="115b12b9-5377-4f22-b13b-a5aaf42dd570" containerName="oc" Feb 28 04:00:00 crc kubenswrapper[4819]: E0228 04:00:00.158926 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cccb0a9-d379-4387-af64-270369275bc6" containerName="gather" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.158934 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cccb0a9-d379-4387-af64-270369275bc6" containerName="gather" Feb 28 04:00:00 crc 
kubenswrapper[4819]: I0228 04:00:00.159045 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="115b12b9-5377-4f22-b13b-a5aaf42dd570" containerName="oc" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.159056 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cccb0a9-d379-4387-af64-270369275bc6" containerName="gather" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.159071 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cccb0a9-d379-4387-af64-270369275bc6" containerName="copy" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.159562 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537520-bxcj7" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.161566 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.161931 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.164110 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.169062 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp"] Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.170345 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.172108 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.172703 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.177694 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537520-bxcj7"] Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.184283 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30983696-36f0-46ec-ac27-0f12ffd4b4c0-secret-volume\") pod \"collect-profiles-29537520-jsthp\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.184329 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p7nw\" (UniqueName: \"kubernetes.io/projected/c03ac5ac-3d6d-446a-97c7-a5012a033c71-kube-api-access-2p7nw\") pod \"auto-csr-approver-29537520-bxcj7\" (UID: \"c03ac5ac-3d6d-446a-97c7-a5012a033c71\") " pod="openshift-infra/auto-csr-approver-29537520-bxcj7" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.184362 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30983696-36f0-46ec-ac27-0f12ffd4b4c0-config-volume\") pod \"collect-profiles-29537520-jsthp\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 
04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.184447 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jtzh\" (UniqueName: \"kubernetes.io/projected/30983696-36f0-46ec-ac27-0f12ffd4b4c0-kube-api-access-6jtzh\") pod \"collect-profiles-29537520-jsthp\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.185210 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp"] Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.285199 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jtzh\" (UniqueName: \"kubernetes.io/projected/30983696-36f0-46ec-ac27-0f12ffd4b4c0-kube-api-access-6jtzh\") pod \"collect-profiles-29537520-jsthp\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.285312 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30983696-36f0-46ec-ac27-0f12ffd4b4c0-secret-volume\") pod \"collect-profiles-29537520-jsthp\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.285347 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p7nw\" (UniqueName: \"kubernetes.io/projected/c03ac5ac-3d6d-446a-97c7-a5012a033c71-kube-api-access-2p7nw\") pod \"auto-csr-approver-29537520-bxcj7\" (UID: \"c03ac5ac-3d6d-446a-97c7-a5012a033c71\") " pod="openshift-infra/auto-csr-approver-29537520-bxcj7" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.285390 4819 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30983696-36f0-46ec-ac27-0f12ffd4b4c0-config-volume\") pod \"collect-profiles-29537520-jsthp\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.286811 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30983696-36f0-46ec-ac27-0f12ffd4b4c0-config-volume\") pod \"collect-profiles-29537520-jsthp\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.297920 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30983696-36f0-46ec-ac27-0f12ffd4b4c0-secret-volume\") pod \"collect-profiles-29537520-jsthp\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.303748 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jtzh\" (UniqueName: \"kubernetes.io/projected/30983696-36f0-46ec-ac27-0f12ffd4b4c0-kube-api-access-6jtzh\") pod \"collect-profiles-29537520-jsthp\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.307185 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p7nw\" (UniqueName: \"kubernetes.io/projected/c03ac5ac-3d6d-446a-97c7-a5012a033c71-kube-api-access-2p7nw\") pod \"auto-csr-approver-29537520-bxcj7\" (UID: \"c03ac5ac-3d6d-446a-97c7-a5012a033c71\") " 
pod="openshift-infra/auto-csr-approver-29537520-bxcj7" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.521863 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537520-bxcj7" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.543556 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.794568 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537520-bxcj7"] Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.822173 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp"] Feb 28 04:00:00 crc kubenswrapper[4819]: W0228 04:00:00.823630 4819 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30983696_36f0_46ec_ac27_0f12ffd4b4c0.slice/crio-893b3f5dc2ec3e80bfe9c0889204774e34beb911f71f333d661f2152600a0de2 WatchSource:0}: Error finding container 893b3f5dc2ec3e80bfe9c0889204774e34beb911f71f333d661f2152600a0de2: Status 404 returned error can't find the container with id 893b3f5dc2ec3e80bfe9c0889204774e34beb911f71f333d661f2152600a0de2 Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.834484 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:00:00 crc kubenswrapper[4819]: I0228 04:00:00.834538 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:00:01 crc kubenswrapper[4819]: I0228 04:00:01.094096 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" event={"ID":"30983696-36f0-46ec-ac27-0f12ffd4b4c0","Type":"ContainerStarted","Data":"7afc9455c509619bba398610badf82901bdab5ac95fe91b5b8ec75bf1391daaa"} Feb 28 04:00:01 crc kubenswrapper[4819]: I0228 04:00:01.094160 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" event={"ID":"30983696-36f0-46ec-ac27-0f12ffd4b4c0","Type":"ContainerStarted","Data":"893b3f5dc2ec3e80bfe9c0889204774e34beb911f71f333d661f2152600a0de2"} Feb 28 04:00:01 crc kubenswrapper[4819]: I0228 04:00:01.095634 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537520-bxcj7" event={"ID":"c03ac5ac-3d6d-446a-97c7-a5012a033c71","Type":"ContainerStarted","Data":"7e41725c52546041107afbd45ef3b1777a80e675e039ec16e609f5a5750e88f0"} Feb 28 04:00:01 crc kubenswrapper[4819]: I0228 04:00:01.116628 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" podStartSLOduration=1.116612529 podStartE2EDuration="1.116612529s" podCreationTimestamp="2026-02-28 04:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:00:01.110320464 +0000 UTC m=+1539.575889362" watchObservedRunningTime="2026-02-28 04:00:01.116612529 +0000 UTC m=+1539.582181387" Feb 28 04:00:02 crc kubenswrapper[4819]: I0228 04:00:02.107203 4819 generic.go:334] "Generic (PLEG): container finished" podID="30983696-36f0-46ec-ac27-0f12ffd4b4c0" containerID="7afc9455c509619bba398610badf82901bdab5ac95fe91b5b8ec75bf1391daaa" exitCode=0 Feb 28 
04:00:02 crc kubenswrapper[4819]: I0228 04:00:02.107303 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" event={"ID":"30983696-36f0-46ec-ac27-0f12ffd4b4c0","Type":"ContainerDied","Data":"7afc9455c509619bba398610badf82901bdab5ac95fe91b5b8ec75bf1391daaa"} Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.115058 4819 generic.go:334] "Generic (PLEG): container finished" podID="c03ac5ac-3d6d-446a-97c7-a5012a033c71" containerID="35b6f53f4f976b9ab4aeefee60f884944c9d8b2149aa679163d0550e580c4f86" exitCode=0 Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.115159 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537520-bxcj7" event={"ID":"c03ac5ac-3d6d-446a-97c7-a5012a033c71","Type":"ContainerDied","Data":"35b6f53f4f976b9ab4aeefee60f884944c9d8b2149aa679163d0550e580c4f86"} Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.362657 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.431477 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30983696-36f0-46ec-ac27-0f12ffd4b4c0-config-volume\") pod \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.431531 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30983696-36f0-46ec-ac27-0f12ffd4b4c0-secret-volume\") pod \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.431557 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jtzh\" (UniqueName: \"kubernetes.io/projected/30983696-36f0-46ec-ac27-0f12ffd4b4c0-kube-api-access-6jtzh\") pod \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\" (UID: \"30983696-36f0-46ec-ac27-0f12ffd4b4c0\") " Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.433442 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30983696-36f0-46ec-ac27-0f12ffd4b4c0-config-volume" (OuterVolumeSpecName: "config-volume") pod "30983696-36f0-46ec-ac27-0f12ffd4b4c0" (UID: "30983696-36f0-46ec-ac27-0f12ffd4b4c0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.440676 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30983696-36f0-46ec-ac27-0f12ffd4b4c0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "30983696-36f0-46ec-ac27-0f12ffd4b4c0" (UID: "30983696-36f0-46ec-ac27-0f12ffd4b4c0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.440763 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30983696-36f0-46ec-ac27-0f12ffd4b4c0-kube-api-access-6jtzh" (OuterVolumeSpecName: "kube-api-access-6jtzh") pod "30983696-36f0-46ec-ac27-0f12ffd4b4c0" (UID: "30983696-36f0-46ec-ac27-0f12ffd4b4c0"). InnerVolumeSpecName "kube-api-access-6jtzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.533805 4819 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30983696-36f0-46ec-ac27-0f12ffd4b4c0-config-volume\") on node \"crc\" DevicePath \"\"" Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.534078 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jtzh\" (UniqueName: \"kubernetes.io/projected/30983696-36f0-46ec-ac27-0f12ffd4b4c0-kube-api-access-6jtzh\") on node \"crc\" DevicePath \"\"" Feb 28 04:00:03 crc kubenswrapper[4819]: I0228 04:00:03.534223 4819 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30983696-36f0-46ec-ac27-0f12ffd4b4c0-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 28 04:00:04 crc kubenswrapper[4819]: I0228 04:00:04.124827 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" event={"ID":"30983696-36f0-46ec-ac27-0f12ffd4b4c0","Type":"ContainerDied","Data":"893b3f5dc2ec3e80bfe9c0889204774e34beb911f71f333d661f2152600a0de2"} Feb 28 04:00:04 crc kubenswrapper[4819]: I0228 04:00:04.124857 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29537520-jsthp" Feb 28 04:00:04 crc kubenswrapper[4819]: I0228 04:00:04.124885 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="893b3f5dc2ec3e80bfe9c0889204774e34beb911f71f333d661f2152600a0de2" Feb 28 04:00:04 crc kubenswrapper[4819]: I0228 04:00:04.380744 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537520-bxcj7" Feb 28 04:00:04 crc kubenswrapper[4819]: I0228 04:00:04.446459 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p7nw\" (UniqueName: \"kubernetes.io/projected/c03ac5ac-3d6d-446a-97c7-a5012a033c71-kube-api-access-2p7nw\") pod \"c03ac5ac-3d6d-446a-97c7-a5012a033c71\" (UID: \"c03ac5ac-3d6d-446a-97c7-a5012a033c71\") " Feb 28 04:00:04 crc kubenswrapper[4819]: I0228 04:00:04.451424 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ac5ac-3d6d-446a-97c7-a5012a033c71-kube-api-access-2p7nw" (OuterVolumeSpecName: "kube-api-access-2p7nw") pod "c03ac5ac-3d6d-446a-97c7-a5012a033c71" (UID: "c03ac5ac-3d6d-446a-97c7-a5012a033c71"). InnerVolumeSpecName "kube-api-access-2p7nw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:00:04 crc kubenswrapper[4819]: I0228 04:00:04.548568 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p7nw\" (UniqueName: \"kubernetes.io/projected/c03ac5ac-3d6d-446a-97c7-a5012a033c71-kube-api-access-2p7nw\") on node \"crc\" DevicePath \"\"" Feb 28 04:00:05 crc kubenswrapper[4819]: I0228 04:00:05.132237 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537520-bxcj7" event={"ID":"c03ac5ac-3d6d-446a-97c7-a5012a033c71","Type":"ContainerDied","Data":"7e41725c52546041107afbd45ef3b1777a80e675e039ec16e609f5a5750e88f0"} Feb 28 04:00:05 crc kubenswrapper[4819]: I0228 04:00:05.132293 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e41725c52546041107afbd45ef3b1777a80e675e039ec16e609f5a5750e88f0" Feb 28 04:00:05 crc kubenswrapper[4819]: I0228 04:00:05.132367 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537520-bxcj7" Feb 28 04:00:05 crc kubenswrapper[4819]: I0228 04:00:05.463583 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537514-gx2vz"] Feb 28 04:00:05 crc kubenswrapper[4819]: I0228 04:00:05.471986 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537514-gx2vz"] Feb 28 04:00:06 crc kubenswrapper[4819]: I0228 04:00:06.376561 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26b9d23e-ecf4-482b-ac4f-6c7585b30b54" path="/var/lib/kubelet/pods/26b9d23e-ecf4-482b-ac4f-6c7585b30b54/volumes" Feb 28 04:00:30 crc kubenswrapper[4819]: I0228 04:00:30.834614 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 28 04:00:30 crc kubenswrapper[4819]: I0228 04:00:30.835353 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:00:30 crc kubenswrapper[4819]: I0228 04:00:30.835420 4819 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 04:00:30 crc kubenswrapper[4819]: I0228 04:00:30.836231 4819 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a6196e1947fe2ebe77dd10c343260071400941456b6e5fa2ac8b053fa27f275"} pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 04:00:30 crc kubenswrapper[4819]: I0228 04:00:30.836355 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" containerID="cri-o://9a6196e1947fe2ebe77dd10c343260071400941456b6e5fa2ac8b053fa27f275" gracePeriod=600 Feb 28 04:00:31 crc kubenswrapper[4819]: I0228 04:00:31.346023 4819 generic.go:334] "Generic (PLEG): container finished" podID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerID="9a6196e1947fe2ebe77dd10c343260071400941456b6e5fa2ac8b053fa27f275" exitCode=0 Feb 28 04:00:31 crc kubenswrapper[4819]: I0228 04:00:31.346071 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" 
event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerDied","Data":"9a6196e1947fe2ebe77dd10c343260071400941456b6e5fa2ac8b053fa27f275"} Feb 28 04:00:31 crc kubenswrapper[4819]: I0228 04:00:31.346386 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerStarted","Data":"3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"} Feb 28 04:00:31 crc kubenswrapper[4819]: I0228 04:00:31.346411 4819 scope.go:117] "RemoveContainer" containerID="9d369a9c21ccdd5b3db603da688e0c28628885c9c52c044661ee7b6146a29101" Feb 28 04:00:51 crc kubenswrapper[4819]: I0228 04:00:51.751484 4819 scope.go:117] "RemoveContainer" containerID="b886ff7654199bfd530b5f29bcd25710ace87e00b445783c0805ed96066e290a" Feb 28 04:00:51 crc kubenswrapper[4819]: I0228 04:00:51.804527 4819 scope.go:117] "RemoveContainer" containerID="464d5583c42f5212a4c750a4e033617e75baf588a143582bfc7096ea9941af2e" Feb 28 04:00:51 crc kubenswrapper[4819]: I0228 04:00:51.829062 4819 scope.go:117] "RemoveContainer" containerID="e817a969209f281910e85697a63def6e468220d0e416fafb41e938a5c42100da" Feb 28 04:00:51 crc kubenswrapper[4819]: I0228 04:00:51.858077 4819 scope.go:117] "RemoveContainer" containerID="b616418b8ed847a1deb51307c4e891033ecfc2ecf69f98056fb96dc67e3b5de4" Feb 28 04:00:51 crc kubenswrapper[4819]: I0228 04:00:51.873906 4819 scope.go:117] "RemoveContainer" containerID="45670fb02aacf1f22e85015f74089337d3b67b60a37dfd9dc68b4e9d9beba427" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.422708 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6jrnw/must-gather-sph8h"] Feb 28 04:01:02 crc kubenswrapper[4819]: E0228 04:01:02.423407 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30983696-36f0-46ec-ac27-0f12ffd4b4c0" containerName="collect-profiles" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.423427 
4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="30983696-36f0-46ec-ac27-0f12ffd4b4c0" containerName="collect-profiles" Feb 28 04:01:02 crc kubenswrapper[4819]: E0228 04:01:02.423459 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03ac5ac-3d6d-446a-97c7-a5012a033c71" containerName="oc" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.423470 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03ac5ac-3d6d-446a-97c7-a5012a033c71" containerName="oc" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.423650 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="30983696-36f0-46ec-ac27-0f12ffd4b4c0" containerName="collect-profiles" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.423672 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03ac5ac-3d6d-446a-97c7-a5012a033c71" containerName="oc" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.424643 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6jrnw/must-gather-sph8h" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.427508 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6jrnw"/"kube-root-ca.crt" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.434638 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6jrnw"/"openshift-service-ca.crt" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.441812 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6jrnw"/"default-dockercfg-6qmrz" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.472613 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6jrnw/must-gather-sph8h"] Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.581428 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6drqc\" (UniqueName: \"kubernetes.io/projected/8f0a6237-c182-4355-9060-801c74dbe662-kube-api-access-6drqc\") pod \"must-gather-sph8h\" (UID: \"8f0a6237-c182-4355-9060-801c74dbe662\") " pod="openshift-must-gather-6jrnw/must-gather-sph8h" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.581484 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f0a6237-c182-4355-9060-801c74dbe662-must-gather-output\") pod \"must-gather-sph8h\" (UID: \"8f0a6237-c182-4355-9060-801c74dbe662\") " pod="openshift-must-gather-6jrnw/must-gather-sph8h" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.682861 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6drqc\" (UniqueName: \"kubernetes.io/projected/8f0a6237-c182-4355-9060-801c74dbe662-kube-api-access-6drqc\") pod \"must-gather-sph8h\" (UID: \"8f0a6237-c182-4355-9060-801c74dbe662\") " 
pod="openshift-must-gather-6jrnw/must-gather-sph8h" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.682937 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f0a6237-c182-4355-9060-801c74dbe662-must-gather-output\") pod \"must-gather-sph8h\" (UID: \"8f0a6237-c182-4355-9060-801c74dbe662\") " pod="openshift-must-gather-6jrnw/must-gather-sph8h" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.683441 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f0a6237-c182-4355-9060-801c74dbe662-must-gather-output\") pod \"must-gather-sph8h\" (UID: \"8f0a6237-c182-4355-9060-801c74dbe662\") " pod="openshift-must-gather-6jrnw/must-gather-sph8h" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.712844 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6drqc\" (UniqueName: \"kubernetes.io/projected/8f0a6237-c182-4355-9060-801c74dbe662-kube-api-access-6drqc\") pod \"must-gather-sph8h\" (UID: \"8f0a6237-c182-4355-9060-801c74dbe662\") " pod="openshift-must-gather-6jrnw/must-gather-sph8h" Feb 28 04:01:02 crc kubenswrapper[4819]: I0228 04:01:02.741787 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6jrnw/must-gather-sph8h" Feb 28 04:01:03 crc kubenswrapper[4819]: I0228 04:01:03.176312 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6jrnw/must-gather-sph8h"] Feb 28 04:01:03 crc kubenswrapper[4819]: I0228 04:01:03.608416 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jrnw/must-gather-sph8h" event={"ID":"8f0a6237-c182-4355-9060-801c74dbe662","Type":"ContainerStarted","Data":"b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2"} Feb 28 04:01:03 crc kubenswrapper[4819]: I0228 04:01:03.608752 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jrnw/must-gather-sph8h" event={"ID":"8f0a6237-c182-4355-9060-801c74dbe662","Type":"ContainerStarted","Data":"c0e520b6579f2b629ce292fad9a4675608f5800440860102847995248fe45644"} Feb 28 04:01:04 crc kubenswrapper[4819]: I0228 04:01:04.615536 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jrnw/must-gather-sph8h" event={"ID":"8f0a6237-c182-4355-9060-801c74dbe662","Type":"ContainerStarted","Data":"d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae"} Feb 28 04:01:04 crc kubenswrapper[4819]: I0228 04:01:04.642311 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6jrnw/must-gather-sph8h" podStartSLOduration=2.6422351539999998 podStartE2EDuration="2.642235154s" podCreationTimestamp="2026-02-28 04:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-28 04:01:04.636143934 +0000 UTC m=+1603.101712832" watchObservedRunningTime="2026-02-28 04:01:04.642235154 +0000 UTC m=+1603.107804042" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.129372 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-24jrq"] Feb 28 04:01:37 crc 
kubenswrapper[4819]: I0228 04:01:37.131042 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.141129 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-24jrq"] Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.186753 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-utilities\") pod \"certified-operators-24jrq\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.186838 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc4hj\" (UniqueName: \"kubernetes.io/projected/5c425b76-a614-4286-a66e-3f3e30309cbc-kube-api-access-jc4hj\") pod \"certified-operators-24jrq\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.186901 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-catalog-content\") pod \"certified-operators-24jrq\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.288596 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc4hj\" (UniqueName: \"kubernetes.io/projected/5c425b76-a614-4286-a66e-3f3e30309cbc-kube-api-access-jc4hj\") pod \"certified-operators-24jrq\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " 
pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.288679 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-catalog-content\") pod \"certified-operators-24jrq\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.288709 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-utilities\") pod \"certified-operators-24jrq\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.289144 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-utilities\") pod \"certified-operators-24jrq\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.289671 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-catalog-content\") pod \"certified-operators-24jrq\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.309153 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc4hj\" (UniqueName: \"kubernetes.io/projected/5c425b76-a614-4286-a66e-3f3e30309cbc-kube-api-access-jc4hj\") pod \"certified-operators-24jrq\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " 
pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.446336 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.701373 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-24jrq"] Feb 28 04:01:37 crc kubenswrapper[4819]: I0228 04:01:37.859803 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24jrq" event={"ID":"5c425b76-a614-4286-a66e-3f3e30309cbc","Type":"ContainerStarted","Data":"5c87c6a0ab4e3a117570d9a48b3c90421f32e99175a59e2ee89903a410dfe725"} Feb 28 04:01:38 crc kubenswrapper[4819]: I0228 04:01:38.870601 4819 generic.go:334] "Generic (PLEG): container finished" podID="5c425b76-a614-4286-a66e-3f3e30309cbc" containerID="f19be57d6e76bf45ddc37821fc6c2197b14ad96a8f188d8ee4b1e526605dc133" exitCode=0 Feb 28 04:01:38 crc kubenswrapper[4819]: I0228 04:01:38.870668 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24jrq" event={"ID":"5c425b76-a614-4286-a66e-3f3e30309cbc","Type":"ContainerDied","Data":"f19be57d6e76bf45ddc37821fc6c2197b14ad96a8f188d8ee4b1e526605dc133"} Feb 28 04:01:38 crc kubenswrapper[4819]: I0228 04:01:38.873611 4819 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 28 04:01:39 crc kubenswrapper[4819]: I0228 04:01:39.880788 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24jrq" event={"ID":"5c425b76-a614-4286-a66e-3f3e30309cbc","Type":"ContainerStarted","Data":"9f825cf15e88aab48fefe9649a6ae8eb675d61c920cb78fd8fbb40d7cb6d5a83"} Feb 28 04:01:40 crc kubenswrapper[4819]: I0228 04:01:40.889653 4819 generic.go:334] "Generic (PLEG): container finished" podID="5c425b76-a614-4286-a66e-3f3e30309cbc" 
containerID="9f825cf15e88aab48fefe9649a6ae8eb675d61c920cb78fd8fbb40d7cb6d5a83" exitCode=0 Feb 28 04:01:40 crc kubenswrapper[4819]: I0228 04:01:40.889765 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24jrq" event={"ID":"5c425b76-a614-4286-a66e-3f3e30309cbc","Type":"ContainerDied","Data":"9f825cf15e88aab48fefe9649a6ae8eb675d61c920cb78fd8fbb40d7cb6d5a83"} Feb 28 04:01:41 crc kubenswrapper[4819]: I0228 04:01:41.896398 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24jrq" event={"ID":"5c425b76-a614-4286-a66e-3f3e30309cbc","Type":"ContainerStarted","Data":"fe547b400a89209e267a5716c3c6fba3f819e52355580b41efe642f3c7bf7ef2"} Feb 28 04:01:41 crc kubenswrapper[4819]: I0228 04:01:41.919790 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-24jrq" podStartSLOduration=2.441983062 podStartE2EDuration="4.919770489s" podCreationTimestamp="2026-02-28 04:01:37 +0000 UTC" firstStartedPulling="2026-02-28 04:01:38.873197498 +0000 UTC m=+1637.338766396" lastFinishedPulling="2026-02-28 04:01:41.350984915 +0000 UTC m=+1639.816553823" observedRunningTime="2026-02-28 04:01:41.915788312 +0000 UTC m=+1640.381357180" watchObservedRunningTime="2026-02-28 04:01:41.919770489 +0000 UTC m=+1640.385339347" Feb 28 04:01:47 crc kubenswrapper[4819]: I0228 04:01:47.446897 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:47 crc kubenswrapper[4819]: I0228 04:01:47.447384 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:47 crc kubenswrapper[4819]: I0228 04:01:47.493333 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:47 crc kubenswrapper[4819]: I0228 
04:01:47.985608 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:48 crc kubenswrapper[4819]: I0228 04:01:48.036806 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-24jrq"] Feb 28 04:01:49 crc kubenswrapper[4819]: I0228 04:01:49.945683 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-24jrq" podUID="5c425b76-a614-4286-a66e-3f3e30309cbc" containerName="registry-server" containerID="cri-o://fe547b400a89209e267a5716c3c6fba3f819e52355580b41efe642f3c7bf7ef2" gracePeriod=2 Feb 28 04:01:50 crc kubenswrapper[4819]: I0228 04:01:50.954172 4819 generic.go:334] "Generic (PLEG): container finished" podID="5c425b76-a614-4286-a66e-3f3e30309cbc" containerID="fe547b400a89209e267a5716c3c6fba3f819e52355580b41efe642f3c7bf7ef2" exitCode=0 Feb 28 04:01:50 crc kubenswrapper[4819]: I0228 04:01:50.954260 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24jrq" event={"ID":"5c425b76-a614-4286-a66e-3f3e30309cbc","Type":"ContainerDied","Data":"fe547b400a89209e267a5716c3c6fba3f819e52355580b41efe642f3c7bf7ef2"} Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.260092 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.384531 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc4hj\" (UniqueName: \"kubernetes.io/projected/5c425b76-a614-4286-a66e-3f3e30309cbc-kube-api-access-jc4hj\") pod \"5c425b76-a614-4286-a66e-3f3e30309cbc\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.384621 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-utilities\") pod \"5c425b76-a614-4286-a66e-3f3e30309cbc\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.384669 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-catalog-content\") pod \"5c425b76-a614-4286-a66e-3f3e30309cbc\" (UID: \"5c425b76-a614-4286-a66e-3f3e30309cbc\") " Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.386270 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-utilities" (OuterVolumeSpecName: "utilities") pod "5c425b76-a614-4286-a66e-3f3e30309cbc" (UID: "5c425b76-a614-4286-a66e-3f3e30309cbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.389663 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c425b76-a614-4286-a66e-3f3e30309cbc-kube-api-access-jc4hj" (OuterVolumeSpecName: "kube-api-access-jc4hj") pod "5c425b76-a614-4286-a66e-3f3e30309cbc" (UID: "5c425b76-a614-4286-a66e-3f3e30309cbc"). InnerVolumeSpecName "kube-api-access-jc4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.486549 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc4hj\" (UniqueName: \"kubernetes.io/projected/5c425b76-a614-4286-a66e-3f3e30309cbc-kube-api-access-jc4hj\") on node \"crc\" DevicePath \"\"" Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.486598 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.799895 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c425b76-a614-4286-a66e-3f3e30309cbc" (UID: "5c425b76-a614-4286-a66e-3f3e30309cbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.814064 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c425b76-a614-4286-a66e-3f3e30309cbc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.962360 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24jrq" event={"ID":"5c425b76-a614-4286-a66e-3f3e30309cbc","Type":"ContainerDied","Data":"5c87c6a0ab4e3a117570d9a48b3c90421f32e99175a59e2ee89903a410dfe725"} Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.962444 4819 scope.go:117] "RemoveContainer" containerID="fe547b400a89209e267a5716c3c6fba3f819e52355580b41efe642f3c7bf7ef2" Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.962462 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-24jrq" Feb 28 04:01:51 crc kubenswrapper[4819]: I0228 04:01:51.981527 4819 scope.go:117] "RemoveContainer" containerID="9f825cf15e88aab48fefe9649a6ae8eb675d61c920cb78fd8fbb40d7cb6d5a83" Feb 28 04:01:52 crc kubenswrapper[4819]: I0228 04:01:52.008900 4819 scope.go:117] "RemoveContainer" containerID="f19be57d6e76bf45ddc37821fc6c2197b14ad96a8f188d8ee4b1e526605dc133" Feb 28 04:01:52 crc kubenswrapper[4819]: I0228 04:01:52.010117 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-24jrq"] Feb 28 04:01:52 crc kubenswrapper[4819]: I0228 04:01:52.017474 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-24jrq"] Feb 28 04:01:52 crc kubenswrapper[4819]: I0228 04:01:52.384300 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c425b76-a614-4286-a66e-3f3e30309cbc" path="/var/lib/kubelet/pods/5c425b76-a614-4286-a66e-3f3e30309cbc/volumes" Feb 28 04:01:57 crc kubenswrapper[4819]: I0228 04:01:57.234769 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qv8xl_5e568a4f-33f8-447b-840c-dc560774878d/control-plane-machine-set-operator/0.log" Feb 28 04:01:57 crc kubenswrapper[4819]: I0228 04:01:57.348791 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tdcjf_b6a78496-9606-4297-b022-286a969e9ea6/kube-rbac-proxy/0.log" Feb 28 04:01:57 crc kubenswrapper[4819]: I0228 04:01:57.369704 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tdcjf_b6a78496-9606-4297-b022-286a969e9ea6/machine-api-operator/0.log" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.138449 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537522-x85db"] Feb 28 04:02:00 
crc kubenswrapper[4819]: E0228 04:02:00.139041 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c425b76-a614-4286-a66e-3f3e30309cbc" containerName="extract-utilities" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.139058 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c425b76-a614-4286-a66e-3f3e30309cbc" containerName="extract-utilities" Feb 28 04:02:00 crc kubenswrapper[4819]: E0228 04:02:00.139086 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c425b76-a614-4286-a66e-3f3e30309cbc" containerName="registry-server" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.139095 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c425b76-a614-4286-a66e-3f3e30309cbc" containerName="registry-server" Feb 28 04:02:00 crc kubenswrapper[4819]: E0228 04:02:00.139109 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c425b76-a614-4286-a66e-3f3e30309cbc" containerName="extract-content" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.139118 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c425b76-a614-4286-a66e-3f3e30309cbc" containerName="extract-content" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.139271 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c425b76-a614-4286-a66e-3f3e30309cbc" containerName="registry-server" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.139726 4819 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537522-x85db" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.142979 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.143070 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.143716 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.154454 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537522-x85db"] Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.233628 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-664cx\" (UniqueName: \"kubernetes.io/projected/014d9911-5e1b-466c-a29f-ab80661525c8-kube-api-access-664cx\") pod \"auto-csr-approver-29537522-x85db\" (UID: \"014d9911-5e1b-466c-a29f-ab80661525c8\") " pod="openshift-infra/auto-csr-approver-29537522-x85db" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.334828 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-664cx\" (UniqueName: \"kubernetes.io/projected/014d9911-5e1b-466c-a29f-ab80661525c8-kube-api-access-664cx\") pod \"auto-csr-approver-29537522-x85db\" (UID: \"014d9911-5e1b-466c-a29f-ab80661525c8\") " pod="openshift-infra/auto-csr-approver-29537522-x85db" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.356749 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-664cx\" (UniqueName: \"kubernetes.io/projected/014d9911-5e1b-466c-a29f-ab80661525c8-kube-api-access-664cx\") pod \"auto-csr-approver-29537522-x85db\" (UID: \"014d9911-5e1b-466c-a29f-ab80661525c8\") " 
pod="openshift-infra/auto-csr-approver-29537522-x85db" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.471411 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537522-x85db" Feb 28 04:02:00 crc kubenswrapper[4819]: I0228 04:02:00.726522 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537522-x85db"] Feb 28 04:02:01 crc kubenswrapper[4819]: I0228 04:02:01.022755 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537522-x85db" event={"ID":"014d9911-5e1b-466c-a29f-ab80661525c8","Type":"ContainerStarted","Data":"b7905745e3a4de32981c27426920689e7877256a285380e8dd26d9fde5abe5a1"} Feb 28 04:02:03 crc kubenswrapper[4819]: I0228 04:02:03.038645 4819 generic.go:334] "Generic (PLEG): container finished" podID="014d9911-5e1b-466c-a29f-ab80661525c8" containerID="26f6cb39e4849cb65f7c21143812ee1fc3eedfb7182ba383f5f6fadcafd4eeb0" exitCode=0 Feb 28 04:02:03 crc kubenswrapper[4819]: I0228 04:02:03.038751 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537522-x85db" event={"ID":"014d9911-5e1b-466c-a29f-ab80661525c8","Type":"ContainerDied","Data":"26f6cb39e4849cb65f7c21143812ee1fc3eedfb7182ba383f5f6fadcafd4eeb0"} Feb 28 04:02:04 crc kubenswrapper[4819]: I0228 04:02:04.320703 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537522-x85db" Feb 28 04:02:04 crc kubenswrapper[4819]: I0228 04:02:04.387792 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-664cx\" (UniqueName: \"kubernetes.io/projected/014d9911-5e1b-466c-a29f-ab80661525c8-kube-api-access-664cx\") pod \"014d9911-5e1b-466c-a29f-ab80661525c8\" (UID: \"014d9911-5e1b-466c-a29f-ab80661525c8\") " Feb 28 04:02:04 crc kubenswrapper[4819]: I0228 04:02:04.396550 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014d9911-5e1b-466c-a29f-ab80661525c8-kube-api-access-664cx" (OuterVolumeSpecName: "kube-api-access-664cx") pod "014d9911-5e1b-466c-a29f-ab80661525c8" (UID: "014d9911-5e1b-466c-a29f-ab80661525c8"). InnerVolumeSpecName "kube-api-access-664cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:02:04 crc kubenswrapper[4819]: I0228 04:02:04.489613 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-664cx\" (UniqueName: \"kubernetes.io/projected/014d9911-5e1b-466c-a29f-ab80661525c8-kube-api-access-664cx\") on node \"crc\" DevicePath \"\"" Feb 28 04:02:05 crc kubenswrapper[4819]: I0228 04:02:05.054855 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537522-x85db" event={"ID":"014d9911-5e1b-466c-a29f-ab80661525c8","Type":"ContainerDied","Data":"b7905745e3a4de32981c27426920689e7877256a285380e8dd26d9fde5abe5a1"} Feb 28 04:02:05 crc kubenswrapper[4819]: I0228 04:02:05.055269 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7905745e3a4de32981c27426920689e7877256a285380e8dd26d9fde5abe5a1" Feb 28 04:02:05 crc kubenswrapper[4819]: I0228 04:02:05.054930 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537522-x85db" Feb 28 04:02:05 crc kubenswrapper[4819]: I0228 04:02:05.387583 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537516-rbhzb"] Feb 28 04:02:05 crc kubenswrapper[4819]: I0228 04:02:05.392394 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537516-rbhzb"] Feb 28 04:02:06 crc kubenswrapper[4819]: I0228 04:02:06.384906 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303e7aea-596a-4429-a4d1-e88b6737bd9b" path="/var/lib/kubelet/pods/303e7aea-596a-4429-a4d1-e88b6737bd9b/volumes" Feb 28 04:02:26 crc kubenswrapper[4819]: I0228 04:02:26.616936 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-sgqvl_afd495b6-4c71-455d-ba10-061bbf630cc5/controller/0.log" Feb 28 04:02:26 crc kubenswrapper[4819]: I0228 04:02:26.617849 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-sgqvl_afd495b6-4c71-455d-ba10-061bbf630cc5/kube-rbac-proxy/0.log" Feb 28 04:02:26 crc kubenswrapper[4819]: I0228 04:02:26.770483 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-frr-files/0.log" Feb 28 04:02:26 crc kubenswrapper[4819]: I0228 04:02:26.908429 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-reloader/0.log" Feb 28 04:02:26 crc kubenswrapper[4819]: I0228 04:02:26.929174 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-metrics/0.log" Feb 28 04:02:26 crc kubenswrapper[4819]: I0228 04:02:26.942758 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-reloader/0.log" Feb 28 04:02:26 
crc kubenswrapper[4819]: I0228 04:02:26.962173 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-frr-files/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.113501 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-reloader/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.117272 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-metrics/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.142830 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-metrics/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.152588 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-frr-files/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.330747 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-frr-files/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.350610 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-reloader/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.379399 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/controller/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.395865 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/cp-metrics/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.508372 4819 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/frr-metrics/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.539488 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/kube-rbac-proxy/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.577609 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/kube-rbac-proxy-frr/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.732860 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/reloader/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.739865 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-xwnrs_d65903fa-28c3-4b9d-890d-7b605a91f0d8/frr-k8s-webhook-server/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.861204 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n8zsj_24de1316-bb5e-4366-a897-7d4dc6349b1f/frr/0.log" Feb 28 04:02:27 crc kubenswrapper[4819]: I0228 04:02:27.907237 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d5484c9c9-4vfr2_4384095a-345c-400a-bd62-0f8ca53b1ea3/manager/0.log" Feb 28 04:02:28 crc kubenswrapper[4819]: I0228 04:02:28.000216 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-644d79b54d-t68wv_55049d53-0610-4ce3-843d-209930ea1421/webhook-server/0.log" Feb 28 04:02:28 crc kubenswrapper[4819]: I0228 04:02:28.077976 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zt2hg_3257ff95-9460-49e8-8ae9-9758e419abee/kube-rbac-proxy/0.log" Feb 28 04:02:28 crc kubenswrapper[4819]: I0228 04:02:28.223041 4819 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zt2hg_3257ff95-9460-49e8-8ae9-9758e419abee/speaker/0.log" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.041605 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-swwwk"] Feb 28 04:02:47 crc kubenswrapper[4819]: E0228 04:02:47.042619 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014d9911-5e1b-466c-a29f-ab80661525c8" containerName="oc" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.042641 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="014d9911-5e1b-466c-a29f-ab80661525c8" containerName="oc" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.042844 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="014d9911-5e1b-466c-a29f-ab80661525c8" containerName="oc" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.044541 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.060092 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swwwk"] Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.195269 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-utilities\") pod \"community-operators-swwwk\" (UID: \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.195570 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ggv4\" (UniqueName: \"kubernetes.io/projected/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-kube-api-access-6ggv4\") pod \"community-operators-swwwk\" (UID: 
\"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.195650 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-catalog-content\") pod \"community-operators-swwwk\" (UID: \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.297036 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-utilities\") pod \"community-operators-swwwk\" (UID: \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.297089 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ggv4\" (UniqueName: \"kubernetes.io/projected/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-kube-api-access-6ggv4\") pod \"community-operators-swwwk\" (UID: \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.297113 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-catalog-content\") pod \"community-operators-swwwk\" (UID: \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.297548 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-catalog-content\") pod \"community-operators-swwwk\" (UID: 
\"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.297915 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-utilities\") pod \"community-operators-swwwk\" (UID: \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.317155 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ggv4\" (UniqueName: \"kubernetes.io/projected/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-kube-api-access-6ggv4\") pod \"community-operators-swwwk\" (UID: \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.362599 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:47 crc kubenswrapper[4819]: I0228 04:02:47.830691 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swwwk"] Feb 28 04:02:48 crc kubenswrapper[4819]: I0228 04:02:48.511424 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwwk" event={"ID":"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9","Type":"ContainerStarted","Data":"39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab"} Feb 28 04:02:48 crc kubenswrapper[4819]: I0228 04:02:48.511821 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwwk" event={"ID":"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9","Type":"ContainerStarted","Data":"acf7fef8caba2d8327569b187771be91f09d78d5845f0dc084068ea569a81f39"} Feb 28 04:02:49 crc kubenswrapper[4819]: I0228 04:02:49.518864 4819 generic.go:334] "Generic (PLEG): 
container finished" podID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" containerID="39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab" exitCode=0 Feb 28 04:02:49 crc kubenswrapper[4819]: I0228 04:02:49.518909 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwwk" event={"ID":"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9","Type":"ContainerDied","Data":"39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab"} Feb 28 04:02:50 crc kubenswrapper[4819]: I0228 04:02:50.527866 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwwk" event={"ID":"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9","Type":"ContainerStarted","Data":"63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf"} Feb 28 04:02:51 crc kubenswrapper[4819]: I0228 04:02:51.538199 4819 generic.go:334] "Generic (PLEG): container finished" podID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" containerID="63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf" exitCode=0 Feb 28 04:02:51 crc kubenswrapper[4819]: I0228 04:02:51.538283 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwwk" event={"ID":"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9","Type":"ContainerDied","Data":"63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf"} Feb 28 04:02:52 crc kubenswrapper[4819]: I0228 04:02:52.002445 4819 scope.go:117] "RemoveContainer" containerID="2b5db0da62d02cc5c546f3fa90b2354a54c282131e81aced30a9d5dfd7646a76" Feb 28 04:02:52 crc kubenswrapper[4819]: I0228 04:02:52.548332 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwwk" event={"ID":"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9","Type":"ContainerStarted","Data":"10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00"} Feb 28 04:02:52 crc kubenswrapper[4819]: I0228 04:02:52.573625 4819 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-swwwk" podStartSLOduration=3.185215465 podStartE2EDuration="5.573609553s" podCreationTimestamp="2026-02-28 04:02:47 +0000 UTC" firstStartedPulling="2026-02-28 04:02:49.520268725 +0000 UTC m=+1707.985837593" lastFinishedPulling="2026-02-28 04:02:51.908662793 +0000 UTC m=+1710.374231681" observedRunningTime="2026-02-28 04:02:52.570534548 +0000 UTC m=+1711.036103406" watchObservedRunningTime="2026-02-28 04:02:52.573609553 +0000 UTC m=+1711.039178411" Feb 28 04:02:55 crc kubenswrapper[4819]: I0228 04:02:55.758034 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-utilities/0.log" Feb 28 04:02:55 crc kubenswrapper[4819]: I0228 04:02:55.936519 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-utilities/0.log" Feb 28 04:02:55 crc kubenswrapper[4819]: I0228 04:02:55.968426 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-content/0.log" Feb 28 04:02:55 crc kubenswrapper[4819]: I0228 04:02:55.995632 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-content/0.log" Feb 28 04:02:56 crc kubenswrapper[4819]: I0228 04:02:56.185996 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-content/0.log" Feb 28 04:02:56 crc kubenswrapper[4819]: I0228 04:02:56.189632 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/extract-utilities/0.log" Feb 28 04:02:56 crc kubenswrapper[4819]: I0228 04:02:56.397222 
4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-utilities/0.log" Feb 28 04:02:56 crc kubenswrapper[4819]: I0228 04:02:56.577480 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-utilities/0.log" Feb 28 04:02:56 crc kubenswrapper[4819]: I0228 04:02:56.612821 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-content/0.log" Feb 28 04:02:56 crc kubenswrapper[4819]: I0228 04:02:56.649649 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-content/0.log" Feb 28 04:02:56 crc kubenswrapper[4819]: I0228 04:02:56.664922 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t9qpv_b971502b-767d-422a-94cf-71377b40763d/registry-server/0.log" Feb 28 04:02:56 crc kubenswrapper[4819]: I0228 04:02:56.786086 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-content/0.log" Feb 28 04:02:56 crc kubenswrapper[4819]: I0228 04:02:56.787596 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/extract-utilities/0.log" Feb 28 04:02:56 crc kubenswrapper[4819]: I0228 04:02:56.974503 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-swwwk_e8a313e1-b4ca-46e9-a288-7091a4b9d1f9/extract-utilities/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.086333 4819 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fxn87_97da816f-e1cf-43dc-b0ae-d78c31a33a19/registry-server/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.144974 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-swwwk_e8a313e1-b4ca-46e9-a288-7091a4b9d1f9/extract-content/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.168900 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-swwwk_e8a313e1-b4ca-46e9-a288-7091a4b9d1f9/extract-utilities/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.201008 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-swwwk_e8a313e1-b4ca-46e9-a288-7091a4b9d1f9/extract-content/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.342599 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-swwwk_e8a313e1-b4ca-46e9-a288-7091a4b9d1f9/extract-utilities/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.348346 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-swwwk_e8a313e1-b4ca-46e9-a288-7091a4b9d1f9/extract-content/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.363301 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.363351 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.377562 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-swwwk_e8a313e1-b4ca-46e9-a288-7091a4b9d1f9/registry-server/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.402961 4819 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.511541 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/util/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.615602 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.660216 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swwwk"] Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.685911 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/pull/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.686294 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/util/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.730008 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/pull/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.863217 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/extract/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.864788 4819 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/util/0.log" Feb 28 04:02:57 crc kubenswrapper[4819]: I0228 04:02:57.872100 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4sd9vq_434b71de-3f2c-4820-943d-4b3b20e82fe2/pull/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.007918 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-wnnt5_15512f8d-a53e-47cb-9b22-b8f8f410d65d/marketplace-operator/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.046392 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-utilities/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.209629 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-utilities/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.213799 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-content/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.214990 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-content/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.375457 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-utilities/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.407649 4819 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/extract-content/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.470028 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s552m_cb9d52af-88db-45b1-9d8f-2023d6116b4d/registry-server/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.537985 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-utilities/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.725535 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-utilities/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.746933 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-content/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.747476 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-content/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.893709 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-utilities/0.log" Feb 28 04:02:58 crc kubenswrapper[4819]: I0228 04:02:58.910069 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/extract-content/0.log" Feb 28 04:02:59 crc kubenswrapper[4819]: I0228 04:02:59.188743 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9fft7_a5d1a220-cc5c-4631-b4ef-2fc4321b1b7d/registry-server/0.log" Feb 28 
04:02:59 crc kubenswrapper[4819]: I0228 04:02:59.586668 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-swwwk" podUID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" containerName="registry-server" containerID="cri-o://10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00" gracePeriod=2 Feb 28 04:02:59 crc kubenswrapper[4819]: I0228 04:02:59.936292 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.085628 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ggv4\" (UniqueName: \"kubernetes.io/projected/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-kube-api-access-6ggv4\") pod \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\" (UID: \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.085744 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-catalog-content\") pod \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\" (UID: \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.085786 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-utilities\") pod \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\" (UID: \"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9\") " Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.087792 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-utilities" (OuterVolumeSpecName: "utilities") pod "e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" (UID: "e8a313e1-b4ca-46e9-a288-7091a4b9d1f9"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.103516 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-kube-api-access-6ggv4" (OuterVolumeSpecName: "kube-api-access-6ggv4") pod "e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" (UID: "e8a313e1-b4ca-46e9-a288-7091a4b9d1f9"). InnerVolumeSpecName "kube-api-access-6ggv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.171691 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" (UID: "e8a313e1-b4ca-46e9-a288-7091a4b9d1f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.187718 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ggv4\" (UniqueName: \"kubernetes.io/projected/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-kube-api-access-6ggv4\") on node \"crc\" DevicePath \"\"" Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.187762 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.187779 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.596723 4819 generic.go:334] "Generic (PLEG): container finished" podID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" 
containerID="10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00" exitCode=0 Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.596784 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwwk" event={"ID":"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9","Type":"ContainerDied","Data":"10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00"} Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.596807 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swwwk" Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.596827 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swwwk" event={"ID":"e8a313e1-b4ca-46e9-a288-7091a4b9d1f9","Type":"ContainerDied","Data":"acf7fef8caba2d8327569b187771be91f09d78d5845f0dc084068ea569a81f39"} Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.596856 4819 scope.go:117] "RemoveContainer" containerID="10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00" Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.622552 4819 scope.go:117] "RemoveContainer" containerID="63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf" Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.638537 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swwwk"] Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.638595 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-swwwk"] Feb 28 04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.991628 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 
04:03:00 crc kubenswrapper[4819]: I0228 04:03:00.992583 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:03:01 crc kubenswrapper[4819]: I0228 04:03:01.013098 4819 scope.go:117] "RemoveContainer" containerID="39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab" Feb 28 04:03:01 crc kubenswrapper[4819]: I0228 04:03:01.038546 4819 scope.go:117] "RemoveContainer" containerID="10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00" Feb 28 04:03:01 crc kubenswrapper[4819]: E0228 04:03:01.039228 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00\": container with ID starting with 10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00 not found: ID does not exist" containerID="10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00" Feb 28 04:03:01 crc kubenswrapper[4819]: I0228 04:03:01.039305 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00"} err="failed to get container status \"10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00\": rpc error: code = NotFound desc = could not find container \"10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00\": container with ID starting with 10ab891a14e2f13a747aad02fa2076e92659ce33f70e8630cef217b2770fcd00 not found: ID does not exist" Feb 28 04:03:01 crc kubenswrapper[4819]: I0228 04:03:01.039338 4819 scope.go:117] "RemoveContainer" containerID="63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf" Feb 28 04:03:01 
crc kubenswrapper[4819]: E0228 04:03:01.039938 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf\": container with ID starting with 63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf not found: ID does not exist" containerID="63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf" Feb 28 04:03:01 crc kubenswrapper[4819]: I0228 04:03:01.039990 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf"} err="failed to get container status \"63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf\": rpc error: code = NotFound desc = could not find container \"63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf\": container with ID starting with 63f762a00bdab0622e1bdc88d982e7775b1cded8eb4fcba5505f368966260edf not found: ID does not exist" Feb 28 04:03:01 crc kubenswrapper[4819]: I0228 04:03:01.040029 4819 scope.go:117] "RemoveContainer" containerID="39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab" Feb 28 04:03:01 crc kubenswrapper[4819]: E0228 04:03:01.040485 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab\": container with ID starting with 39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab not found: ID does not exist" containerID="39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab" Feb 28 04:03:01 crc kubenswrapper[4819]: I0228 04:03:01.040521 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab"} err="failed to get container status 
\"39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab\": rpc error: code = NotFound desc = could not find container \"39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab\": container with ID starting with 39554cec068b9dff1fdfa301faf8bad2ca5457dc7f01fd6f2b8adbb61fe513ab not found: ID does not exist" Feb 28 04:03:02 crc kubenswrapper[4819]: I0228 04:03:02.382856 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" path="/var/lib/kubelet/pods/e8a313e1-b4ca-46e9-a288-7091a4b9d1f9/volumes" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.745910 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-prp86"] Feb 28 04:03:17 crc kubenswrapper[4819]: E0228 04:03:17.748039 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" containerName="extract-content" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.748068 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" containerName="extract-content" Feb 28 04:03:17 crc kubenswrapper[4819]: E0228 04:03:17.748130 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" containerName="registry-server" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.748144 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" containerName="registry-server" Feb 28 04:03:17 crc kubenswrapper[4819]: E0228 04:03:17.748159 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" containerName="extract-utilities" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.748173 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" containerName="extract-utilities" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 
04:03:17.748411 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a313e1-b4ca-46e9-a288-7091a4b9d1f9" containerName="registry-server" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.750159 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.759512 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-prp86"] Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.843288 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-utilities\") pod \"redhat-marketplace-prp86\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.843476 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-catalog-content\") pod \"redhat-marketplace-prp86\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.843924 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl7db\" (UniqueName: \"kubernetes.io/projected/5a1b8162-2c18-4998-80e5-5af5a0c600b3-kube-api-access-dl7db\") pod \"redhat-marketplace-prp86\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.945774 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl7db\" (UniqueName: 
\"kubernetes.io/projected/5a1b8162-2c18-4998-80e5-5af5a0c600b3-kube-api-access-dl7db\") pod \"redhat-marketplace-prp86\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.945864 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-utilities\") pod \"redhat-marketplace-prp86\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.945924 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-catalog-content\") pod \"redhat-marketplace-prp86\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.946532 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-utilities\") pod \"redhat-marketplace-prp86\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.946728 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-catalog-content\") pod \"redhat-marketplace-prp86\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:17 crc kubenswrapper[4819]: I0228 04:03:17.983303 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl7db\" (UniqueName: 
\"kubernetes.io/projected/5a1b8162-2c18-4998-80e5-5af5a0c600b3-kube-api-access-dl7db\") pod \"redhat-marketplace-prp86\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:18 crc kubenswrapper[4819]: I0228 04:03:18.083734 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:18 crc kubenswrapper[4819]: I0228 04:03:18.356208 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-prp86"] Feb 28 04:03:18 crc kubenswrapper[4819]: I0228 04:03:18.721064 4819 generic.go:334] "Generic (PLEG): container finished" podID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" containerID="2cf5687e3a788a76670e99535e19b6c21769ede477196ec2aac9a4aa5a406b80" exitCode=0 Feb 28 04:03:18 crc kubenswrapper[4819]: I0228 04:03:18.721134 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prp86" event={"ID":"5a1b8162-2c18-4998-80e5-5af5a0c600b3","Type":"ContainerDied","Data":"2cf5687e3a788a76670e99535e19b6c21769ede477196ec2aac9a4aa5a406b80"} Feb 28 04:03:18 crc kubenswrapper[4819]: I0228 04:03:18.721353 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prp86" event={"ID":"5a1b8162-2c18-4998-80e5-5af5a0c600b3","Type":"ContainerStarted","Data":"352744f70b2a8e1710e47111e33091ed8b7e623eb204093737bbb17117e89ee2"} Feb 28 04:03:19 crc kubenswrapper[4819]: I0228 04:03:19.731315 4819 generic.go:334] "Generic (PLEG): container finished" podID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" containerID="785aed6b5c0a0b973d116f35b43db60454c3f6e2411ede4e965a9486b05e8838" exitCode=0 Feb 28 04:03:19 crc kubenswrapper[4819]: I0228 04:03:19.731923 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prp86" 
event={"ID":"5a1b8162-2c18-4998-80e5-5af5a0c600b3","Type":"ContainerDied","Data":"785aed6b5c0a0b973d116f35b43db60454c3f6e2411ede4e965a9486b05e8838"} Feb 28 04:03:20 crc kubenswrapper[4819]: I0228 04:03:20.740094 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prp86" event={"ID":"5a1b8162-2c18-4998-80e5-5af5a0c600b3","Type":"ContainerStarted","Data":"445f7e506b1f40fd05238cfcc1f258c9d979c539c166b74a5a6dfb983fbaca8d"} Feb 28 04:03:28 crc kubenswrapper[4819]: I0228 04:03:28.084006 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:28 crc kubenswrapper[4819]: I0228 04:03:28.084630 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:28 crc kubenswrapper[4819]: I0228 04:03:28.171093 4819 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:28 crc kubenswrapper[4819]: I0228 04:03:28.207231 4819 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-prp86" podStartSLOduration=9.77845322 podStartE2EDuration="11.20720334s" podCreationTimestamp="2026-02-28 04:03:17 +0000 UTC" firstStartedPulling="2026-02-28 04:03:18.722352495 +0000 UTC m=+1737.187921353" lastFinishedPulling="2026-02-28 04:03:20.151102595 +0000 UTC m=+1738.616671473" observedRunningTime="2026-02-28 04:03:20.774756527 +0000 UTC m=+1739.240325395" watchObservedRunningTime="2026-02-28 04:03:28.20720334 +0000 UTC m=+1746.672772228" Feb 28 04:03:28 crc kubenswrapper[4819]: I0228 04:03:28.855899 4819 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:28 crc kubenswrapper[4819]: I0228 04:03:28.934657 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-prp86"] Feb 28 04:03:30 crc kubenswrapper[4819]: I0228 04:03:30.808729 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-prp86" podUID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" containerName="registry-server" containerID="cri-o://445f7e506b1f40fd05238cfcc1f258c9d979c539c166b74a5a6dfb983fbaca8d" gracePeriod=2 Feb 28 04:03:30 crc kubenswrapper[4819]: I0228 04:03:30.834419 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:03:30 crc kubenswrapper[4819]: I0228 04:03:30.834490 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:03:31 crc kubenswrapper[4819]: I0228 04:03:31.834017 4819 generic.go:334] "Generic (PLEG): container finished" podID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" containerID="445f7e506b1f40fd05238cfcc1f258c9d979c539c166b74a5a6dfb983fbaca8d" exitCode=0 Feb 28 04:03:31 crc kubenswrapper[4819]: I0228 04:03:31.834207 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prp86" event={"ID":"5a1b8162-2c18-4998-80e5-5af5a0c600b3","Type":"ContainerDied","Data":"445f7e506b1f40fd05238cfcc1f258c9d979c539c166b74a5a6dfb983fbaca8d"} Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.063421 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.213433 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-utilities\") pod \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.213469 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl7db\" (UniqueName: \"kubernetes.io/projected/5a1b8162-2c18-4998-80e5-5af5a0c600b3-kube-api-access-dl7db\") pod \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.213561 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-catalog-content\") pod \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\" (UID: \"5a1b8162-2c18-4998-80e5-5af5a0c600b3\") " Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.214137 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-utilities" (OuterVolumeSpecName: "utilities") pod "5a1b8162-2c18-4998-80e5-5af5a0c600b3" (UID: "5a1b8162-2c18-4998-80e5-5af5a0c600b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.233544 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1b8162-2c18-4998-80e5-5af5a0c600b3-kube-api-access-dl7db" (OuterVolumeSpecName: "kube-api-access-dl7db") pod "5a1b8162-2c18-4998-80e5-5af5a0c600b3" (UID: "5a1b8162-2c18-4998-80e5-5af5a0c600b3"). InnerVolumeSpecName "kube-api-access-dl7db". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.238450 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a1b8162-2c18-4998-80e5-5af5a0c600b3" (UID: "5a1b8162-2c18-4998-80e5-5af5a0c600b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.315124 4819 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.315196 4819 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b8162-2c18-4998-80e5-5af5a0c600b3-utilities\") on node \"crc\" DevicePath \"\"" Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.315227 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl7db\" (UniqueName: \"kubernetes.io/projected/5a1b8162-2c18-4998-80e5-5af5a0c600b3-kube-api-access-dl7db\") on node \"crc\" DevicePath \"\"" Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.847010 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-prp86" event={"ID":"5a1b8162-2c18-4998-80e5-5af5a0c600b3","Type":"ContainerDied","Data":"352744f70b2a8e1710e47111e33091ed8b7e623eb204093737bbb17117e89ee2"} Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.847102 4819 scope.go:117] "RemoveContainer" containerID="445f7e506b1f40fd05238cfcc1f258c9d979c539c166b74a5a6dfb983fbaca8d" Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.847335 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-prp86" Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.876375 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-prp86"] Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.883940 4819 scope.go:117] "RemoveContainer" containerID="785aed6b5c0a0b973d116f35b43db60454c3f6e2411ede4e965a9486b05e8838" Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.891069 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-prp86"] Feb 28 04:03:32 crc kubenswrapper[4819]: I0228 04:03:32.911050 4819 scope.go:117] "RemoveContainer" containerID="2cf5687e3a788a76670e99535e19b6c21769ede477196ec2aac9a4aa5a406b80" Feb 28 04:03:34 crc kubenswrapper[4819]: I0228 04:03:34.384079 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" path="/var/lib/kubelet/pods/5a1b8162-2c18-4998-80e5-5af5a0c600b3/volumes" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.184867 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537524-5lmhk"] Feb 28 04:04:00 crc kubenswrapper[4819]: E0228 04:04:00.185405 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" containerName="registry-server" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.185422 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" containerName="registry-server" Feb 28 04:04:00 crc kubenswrapper[4819]: E0228 04:04:00.185435 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" containerName="extract-content" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.185443 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" containerName="extract-content" Feb 
28 04:04:00 crc kubenswrapper[4819]: E0228 04:04:00.185457 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" containerName="extract-utilities" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.185465 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" containerName="extract-utilities" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.187062 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1b8162-2c18-4998-80e5-5af5a0c600b3" containerName="registry-server" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.191414 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537524-5lmhk" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.198154 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.199034 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.202437 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537524-5lmhk"] Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.204479 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.281033 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvjlx\" (UniqueName: \"kubernetes.io/projected/330e3dd4-daea-4afd-9f27-612b98c61bbe-kube-api-access-tvjlx\") pod \"auto-csr-approver-29537524-5lmhk\" (UID: \"330e3dd4-daea-4afd-9f27-612b98c61bbe\") " pod="openshift-infra/auto-csr-approver-29537524-5lmhk" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.381790 
4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvjlx\" (UniqueName: \"kubernetes.io/projected/330e3dd4-daea-4afd-9f27-612b98c61bbe-kube-api-access-tvjlx\") pod \"auto-csr-approver-29537524-5lmhk\" (UID: \"330e3dd4-daea-4afd-9f27-612b98c61bbe\") " pod="openshift-infra/auto-csr-approver-29537524-5lmhk" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.427981 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvjlx\" (UniqueName: \"kubernetes.io/projected/330e3dd4-daea-4afd-9f27-612b98c61bbe-kube-api-access-tvjlx\") pod \"auto-csr-approver-29537524-5lmhk\" (UID: \"330e3dd4-daea-4afd-9f27-612b98c61bbe\") " pod="openshift-infra/auto-csr-approver-29537524-5lmhk" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.517421 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537524-5lmhk" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.753619 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537524-5lmhk"] Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.841201 4819 patch_prober.go:28] interesting pod/machine-config-daemon-rw4hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.841356 4819 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.841424 4819 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.842358 4819 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"} pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 28 04:04:00 crc kubenswrapper[4819]: I0228 04:04:00.842471 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerName="machine-config-daemon" containerID="cri-o://3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20" gracePeriod=600 Feb 28 04:04:01 crc kubenswrapper[4819]: E0228 04:04:01.043453 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" Feb 28 04:04:01 crc kubenswrapper[4819]: I0228 04:04:01.191940 4819 generic.go:334] "Generic (PLEG): container finished" podID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20" exitCode=0 Feb 28 04:04:01 crc kubenswrapper[4819]: I0228 04:04:01.192008 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" event={"ID":"d6ad11c1-0eb7-4064-bb39-3ffb389efb90","Type":"ContainerDied","Data":"3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"} 
Feb 28 04:04:01 crc kubenswrapper[4819]: I0228 04:04:01.192043 4819 scope.go:117] "RemoveContainer" containerID="9a6196e1947fe2ebe77dd10c343260071400941456b6e5fa2ac8b053fa27f275" Feb 28 04:04:01 crc kubenswrapper[4819]: I0228 04:04:01.192559 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20" Feb 28 04:04:01 crc kubenswrapper[4819]: E0228 04:04:01.192850 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" Feb 28 04:04:01 crc kubenswrapper[4819]: I0228 04:04:01.193456 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537524-5lmhk" event={"ID":"330e3dd4-daea-4afd-9f27-612b98c61bbe","Type":"ContainerStarted","Data":"e590602d4bfa86fd7a1e7b65a0f7808d42056ff5432255c3c736bcf7ac5f3676"} Feb 28 04:04:02 crc kubenswrapper[4819]: I0228 04:04:02.208536 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537524-5lmhk" event={"ID":"330e3dd4-daea-4afd-9f27-612b98c61bbe","Type":"ContainerStarted","Data":"9cc11e038be52ecc6d4b4301b5239f1c622f1cc15f5879734e17f78a7a995a20"} Feb 28 04:04:03 crc kubenswrapper[4819]: I0228 04:04:03.227823 4819 generic.go:334] "Generic (PLEG): container finished" podID="330e3dd4-daea-4afd-9f27-612b98c61bbe" containerID="9cc11e038be52ecc6d4b4301b5239f1c622f1cc15f5879734e17f78a7a995a20" exitCode=0 Feb 28 04:04:03 crc kubenswrapper[4819]: I0228 04:04:03.228685 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537524-5lmhk" 
event={"ID":"330e3dd4-daea-4afd-9f27-612b98c61bbe","Type":"ContainerDied","Data":"9cc11e038be52ecc6d4b4301b5239f1c622f1cc15f5879734e17f78a7a995a20"} Feb 28 04:04:03 crc kubenswrapper[4819]: I0228 04:04:03.457165 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537524-5lmhk" Feb 28 04:04:03 crc kubenswrapper[4819]: I0228 04:04:03.624302 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvjlx\" (UniqueName: \"kubernetes.io/projected/330e3dd4-daea-4afd-9f27-612b98c61bbe-kube-api-access-tvjlx\") pod \"330e3dd4-daea-4afd-9f27-612b98c61bbe\" (UID: \"330e3dd4-daea-4afd-9f27-612b98c61bbe\") " Feb 28 04:04:03 crc kubenswrapper[4819]: I0228 04:04:03.647552 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330e3dd4-daea-4afd-9f27-612b98c61bbe-kube-api-access-tvjlx" (OuterVolumeSpecName: "kube-api-access-tvjlx") pod "330e3dd4-daea-4afd-9f27-612b98c61bbe" (UID: "330e3dd4-daea-4afd-9f27-612b98c61bbe"). InnerVolumeSpecName "kube-api-access-tvjlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:04:03 crc kubenswrapper[4819]: I0228 04:04:03.725820 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvjlx\" (UniqueName: \"kubernetes.io/projected/330e3dd4-daea-4afd-9f27-612b98c61bbe-kube-api-access-tvjlx\") on node \"crc\" DevicePath \"\"" Feb 28 04:04:04 crc kubenswrapper[4819]: I0228 04:04:04.238737 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537524-5lmhk" event={"ID":"330e3dd4-daea-4afd-9f27-612b98c61bbe","Type":"ContainerDied","Data":"e590602d4bfa86fd7a1e7b65a0f7808d42056ff5432255c3c736bcf7ac5f3676"} Feb 28 04:04:04 crc kubenswrapper[4819]: I0228 04:04:04.238767 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29537524-5lmhk" Feb 28 04:04:04 crc kubenswrapper[4819]: I0228 04:04:04.238794 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e590602d4bfa86fd7a1e7b65a0f7808d42056ff5432255c3c736bcf7ac5f3676" Feb 28 04:04:04 crc kubenswrapper[4819]: I0228 04:04:04.588790 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537518-7v29k"] Feb 28 04:04:04 crc kubenswrapper[4819]: I0228 04:04:04.593067 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537518-7v29k"] Feb 28 04:04:06 crc kubenswrapper[4819]: I0228 04:04:06.381356 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="115b12b9-5377-4f22-b13b-a5aaf42dd570" path="/var/lib/kubelet/pods/115b12b9-5377-4f22-b13b-a5aaf42dd570/volumes" Feb 28 04:04:11 crc kubenswrapper[4819]: I0228 04:04:11.285088 4819 generic.go:334] "Generic (PLEG): container finished" podID="8f0a6237-c182-4355-9060-801c74dbe662" containerID="b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2" exitCode=0 Feb 28 04:04:11 crc kubenswrapper[4819]: I0228 04:04:11.285187 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6jrnw/must-gather-sph8h" event={"ID":"8f0a6237-c182-4355-9060-801c74dbe662","Type":"ContainerDied","Data":"b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2"} Feb 28 04:04:11 crc kubenswrapper[4819]: I0228 04:04:11.287010 4819 scope.go:117] "RemoveContainer" containerID="b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2" Feb 28 04:04:11 crc kubenswrapper[4819]: I0228 04:04:11.498088 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6jrnw_must-gather-sph8h_8f0a6237-c182-4355-9060-801c74dbe662/gather/0.log" Feb 28 04:04:13 crc kubenswrapper[4819]: I0228 04:04:13.369989 4819 scope.go:117] "RemoveContainer" 
containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20" Feb 28 04:04:13 crc kubenswrapper[4819]: E0228 04:04:13.370603 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90" Feb 28 04:04:15 crc kubenswrapper[4819]: E0228 04:04:15.302060 4819 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.212:41114->38.102.83.212:46245: write tcp 38.102.83.212:41114->38.102.83.212:46245: write: broken pipe Feb 28 04:04:20 crc kubenswrapper[4819]: I0228 04:04:20.413896 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6jrnw/must-gather-sph8h"] Feb 28 04:04:20 crc kubenswrapper[4819]: I0228 04:04:20.414802 4819 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6jrnw/must-gather-sph8h" podUID="8f0a6237-c182-4355-9060-801c74dbe662" containerName="copy" containerID="cri-o://d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae" gracePeriod=2 Feb 28 04:04:20 crc kubenswrapper[4819]: I0228 04:04:20.423411 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6jrnw/must-gather-sph8h"] Feb 28 04:04:20 crc kubenswrapper[4819]: I0228 04:04:20.764993 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6jrnw_must-gather-sph8h_8f0a6237-c182-4355-9060-801c74dbe662/copy/0.log" Feb 28 04:04:20 crc kubenswrapper[4819]: I0228 04:04:20.765539 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6jrnw/must-gather-sph8h" Feb 28 04:04:20 crc kubenswrapper[4819]: I0228 04:04:20.784641 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f0a6237-c182-4355-9060-801c74dbe662-must-gather-output\") pod \"8f0a6237-c182-4355-9060-801c74dbe662\" (UID: \"8f0a6237-c182-4355-9060-801c74dbe662\") " Feb 28 04:04:20 crc kubenswrapper[4819]: I0228 04:04:20.784678 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6drqc\" (UniqueName: \"kubernetes.io/projected/8f0a6237-c182-4355-9060-801c74dbe662-kube-api-access-6drqc\") pod \"8f0a6237-c182-4355-9060-801c74dbe662\" (UID: \"8f0a6237-c182-4355-9060-801c74dbe662\") " Feb 28 04:04:20 crc kubenswrapper[4819]: I0228 04:04:20.792313 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0a6237-c182-4355-9060-801c74dbe662-kube-api-access-6drqc" (OuterVolumeSpecName: "kube-api-access-6drqc") pod "8f0a6237-c182-4355-9060-801c74dbe662" (UID: "8f0a6237-c182-4355-9060-801c74dbe662"). InnerVolumeSpecName "kube-api-access-6drqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 28 04:04:20 crc kubenswrapper[4819]: I0228 04:04:20.854602 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0a6237-c182-4355-9060-801c74dbe662-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8f0a6237-c182-4355-9060-801c74dbe662" (UID: "8f0a6237-c182-4355-9060-801c74dbe662"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 28 04:04:20 crc kubenswrapper[4819]: I0228 04:04:20.885619 4819 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8f0a6237-c182-4355-9060-801c74dbe662-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 28 04:04:20 crc kubenswrapper[4819]: I0228 04:04:20.885661 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6drqc\" (UniqueName: \"kubernetes.io/projected/8f0a6237-c182-4355-9060-801c74dbe662-kube-api-access-6drqc\") on node \"crc\" DevicePath \"\"" Feb 28 04:04:21 crc kubenswrapper[4819]: I0228 04:04:21.358558 4819 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6jrnw_must-gather-sph8h_8f0a6237-c182-4355-9060-801c74dbe662/copy/0.log" Feb 28 04:04:21 crc kubenswrapper[4819]: I0228 04:04:21.359036 4819 generic.go:334] "Generic (PLEG): container finished" podID="8f0a6237-c182-4355-9060-801c74dbe662" containerID="d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae" exitCode=143 Feb 28 04:04:21 crc kubenswrapper[4819]: I0228 04:04:21.359088 4819 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6jrnw/must-gather-sph8h"
Feb 28 04:04:21 crc kubenswrapper[4819]: I0228 04:04:21.359094 4819 scope.go:117] "RemoveContainer" containerID="d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae"
Feb 28 04:04:21 crc kubenswrapper[4819]: I0228 04:04:21.377530 4819 scope.go:117] "RemoveContainer" containerID="b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2"
Feb 28 04:04:21 crc kubenswrapper[4819]: I0228 04:04:21.420683 4819 scope.go:117] "RemoveContainer" containerID="d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae"
Feb 28 04:04:21 crc kubenswrapper[4819]: E0228 04:04:21.422308 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae\": container with ID starting with d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae not found: ID does not exist" containerID="d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae"
Feb 28 04:04:21 crc kubenswrapper[4819]: I0228 04:04:21.422364 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae"} err="failed to get container status \"d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae\": rpc error: code = NotFound desc = could not find container \"d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae\": container with ID starting with d0b031c786179518ad272c7cc08ef5c9e225ebe1f66d0d128e00cddb92c6a5ae not found: ID does not exist"
Feb 28 04:04:21 crc kubenswrapper[4819]: I0228 04:04:21.422392 4819 scope.go:117] "RemoveContainer" containerID="b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2"
Feb 28 04:04:21 crc kubenswrapper[4819]: E0228 04:04:21.422843 4819 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2\": container with ID starting with b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2 not found: ID does not exist" containerID="b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2"
Feb 28 04:04:21 crc kubenswrapper[4819]: I0228 04:04:21.422942 4819 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2"} err="failed to get container status \"b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2\": rpc error: code = NotFound desc = could not find container \"b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2\": container with ID starting with b3b4a624919a95d2dc7726ad896da455eec3329546d82ff6841a78c66acf9cc2 not found: ID does not exist"
Feb 28 04:04:22 crc kubenswrapper[4819]: I0228 04:04:22.387434 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0a6237-c182-4355-9060-801c74dbe662" path="/var/lib/kubelet/pods/8f0a6237-c182-4355-9060-801c74dbe662/volumes"
Feb 28 04:04:25 crc kubenswrapper[4819]: I0228 04:04:25.369070 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:04:25 crc kubenswrapper[4819]: E0228 04:04:25.370380 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:04:39 crc kubenswrapper[4819]: I0228 04:04:39.368790 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:04:39 crc kubenswrapper[4819]: E0228 04:04:39.369533 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:04:50 crc kubenswrapper[4819]: I0228 04:04:50.368867 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:04:50 crc kubenswrapper[4819]: E0228 04:04:50.369880 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:04:52 crc kubenswrapper[4819]: I0228 04:04:52.122132 4819 scope.go:117] "RemoveContainer" containerID="5c4a18c64163ab0cf963cbaee6e85eeff2525909ce1699af73c57ec1a9508648"
Feb 28 04:05:03 crc kubenswrapper[4819]: I0228 04:05:03.368954 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:05:03 crc kubenswrapper[4819]: E0228 04:05:03.370202 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:05:18 crc kubenswrapper[4819]: I0228 04:05:18.368842 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:05:18 crc kubenswrapper[4819]: E0228 04:05:18.369385 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:05:33 crc kubenswrapper[4819]: I0228 04:05:33.368617 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:05:33 crc kubenswrapper[4819]: E0228 04:05:33.369638 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:05:48 crc kubenswrapper[4819]: I0228 04:05:48.369753 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:05:48 crc kubenswrapper[4819]: E0228 04:05:48.371064 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.151199 4819 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29537526-gs98n"]
Feb 28 04:06:00 crc kubenswrapper[4819]: E0228 04:06:00.156413 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0a6237-c182-4355-9060-801c74dbe662" containerName="copy"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.156579 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0a6237-c182-4355-9060-801c74dbe662" containerName="copy"
Feb 28 04:06:00 crc kubenswrapper[4819]: E0228 04:06:00.156712 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330e3dd4-daea-4afd-9f27-612b98c61bbe" containerName="oc"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.156837 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="330e3dd4-daea-4afd-9f27-612b98c61bbe" containerName="oc"
Feb 28 04:06:00 crc kubenswrapper[4819]: E0228 04:06:00.156962 4819 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0a6237-c182-4355-9060-801c74dbe662" containerName="gather"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.157072 4819 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0a6237-c182-4355-9060-801c74dbe662" containerName="gather"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.157388 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="330e3dd4-daea-4afd-9f27-612b98c61bbe" containerName="oc"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.157525 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0a6237-c182-4355-9060-801c74dbe662" containerName="copy"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.157656 4819 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0a6237-c182-4355-9060-801c74dbe662" containerName="gather"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.158528 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537526-gs98n"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.160129 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537526-gs98n"]
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.161688 4819 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tsxgw"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.161978 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.166624 4819 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.280890 4819 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5q8m\" (UniqueName: \"kubernetes.io/projected/487741ac-76e3-4252-9974-742dc007ef84-kube-api-access-k5q8m\") pod \"auto-csr-approver-29537526-gs98n\" (UID: \"487741ac-76e3-4252-9974-742dc007ef84\") " pod="openshift-infra/auto-csr-approver-29537526-gs98n"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.368568 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:06:00 crc kubenswrapper[4819]: E0228 04:06:00.368895 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.381835 4819 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5q8m\" (UniqueName: \"kubernetes.io/projected/487741ac-76e3-4252-9974-742dc007ef84-kube-api-access-k5q8m\") pod \"auto-csr-approver-29537526-gs98n\" (UID: \"487741ac-76e3-4252-9974-742dc007ef84\") " pod="openshift-infra/auto-csr-approver-29537526-gs98n"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.416425 4819 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5q8m\" (UniqueName: \"kubernetes.io/projected/487741ac-76e3-4252-9974-742dc007ef84-kube-api-access-k5q8m\") pod \"auto-csr-approver-29537526-gs98n\" (UID: \"487741ac-76e3-4252-9974-742dc007ef84\") " pod="openshift-infra/auto-csr-approver-29537526-gs98n"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.477917 4819 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537526-gs98n"
Feb 28 04:06:00 crc kubenswrapper[4819]: I0228 04:06:00.715173 4819 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29537526-gs98n"]
Feb 28 04:06:01 crc kubenswrapper[4819]: I0228 04:06:01.101049 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537526-gs98n" event={"ID":"487741ac-76e3-4252-9974-742dc007ef84","Type":"ContainerStarted","Data":"70714a02853e40ad0b0c75de8c643ce75b2ea583c21ae4570e33423e5b198bbb"}
Feb 28 04:06:03 crc kubenswrapper[4819]: I0228 04:06:03.115944 4819 generic.go:334] "Generic (PLEG): container finished" podID="487741ac-76e3-4252-9974-742dc007ef84" containerID="415d09d777f592ee09ce12855a549106af8d858bae79d7992ccbb87dfe859b1d" exitCode=0
Feb 28 04:06:03 crc kubenswrapper[4819]: I0228 04:06:03.116041 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537526-gs98n" event={"ID":"487741ac-76e3-4252-9974-742dc007ef84","Type":"ContainerDied","Data":"415d09d777f592ee09ce12855a549106af8d858bae79d7992ccbb87dfe859b1d"}
Feb 28 04:06:04 crc kubenswrapper[4819]: I0228 04:06:04.499745 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537526-gs98n"
Feb 28 04:06:04 crc kubenswrapper[4819]: I0228 04:06:04.571981 4819 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5q8m\" (UniqueName: \"kubernetes.io/projected/487741ac-76e3-4252-9974-742dc007ef84-kube-api-access-k5q8m\") pod \"487741ac-76e3-4252-9974-742dc007ef84\" (UID: \"487741ac-76e3-4252-9974-742dc007ef84\") "
Feb 28 04:06:04 crc kubenswrapper[4819]: I0228 04:06:04.576887 4819 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487741ac-76e3-4252-9974-742dc007ef84-kube-api-access-k5q8m" (OuterVolumeSpecName: "kube-api-access-k5q8m") pod "487741ac-76e3-4252-9974-742dc007ef84" (UID: "487741ac-76e3-4252-9974-742dc007ef84"). InnerVolumeSpecName "kube-api-access-k5q8m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 28 04:06:04 crc kubenswrapper[4819]: I0228 04:06:04.673708 4819 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5q8m\" (UniqueName: \"kubernetes.io/projected/487741ac-76e3-4252-9974-742dc007ef84-kube-api-access-k5q8m\") on node \"crc\" DevicePath \"\""
Feb 28 04:06:05 crc kubenswrapper[4819]: I0228 04:06:05.134715 4819 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29537526-gs98n" event={"ID":"487741ac-76e3-4252-9974-742dc007ef84","Type":"ContainerDied","Data":"70714a02853e40ad0b0c75de8c643ce75b2ea583c21ae4570e33423e5b198bbb"}
Feb 28 04:06:05 crc kubenswrapper[4819]: I0228 04:06:05.134781 4819 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70714a02853e40ad0b0c75de8c643ce75b2ea583c21ae4570e33423e5b198bbb"
Feb 28 04:06:05 crc kubenswrapper[4819]: I0228 04:06:05.134803 4819 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29537526-gs98n"
Feb 28 04:06:05 crc kubenswrapper[4819]: I0228 04:06:05.580918 4819 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29537520-bxcj7"]
Feb 28 04:06:05 crc kubenswrapper[4819]: I0228 04:06:05.589008 4819 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29537520-bxcj7"]
Feb 28 04:06:06 crc kubenswrapper[4819]: I0228 04:06:06.381182 4819 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ac5ac-3d6d-446a-97c7-a5012a033c71" path="/var/lib/kubelet/pods/c03ac5ac-3d6d-446a-97c7-a5012a033c71/volumes"
Feb 28 04:06:13 crc kubenswrapper[4819]: I0228 04:06:13.369291 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:06:13 crc kubenswrapper[4819]: E0228 04:06:13.370031 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:06:24 crc kubenswrapper[4819]: I0228 04:06:24.368673 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:06:24 crc kubenswrapper[4819]: E0228 04:06:24.369651 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:06:37 crc kubenswrapper[4819]: I0228 04:06:37.369234 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:06:37 crc kubenswrapper[4819]: E0228 04:06:37.371096 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:06:49 crc kubenswrapper[4819]: I0228 04:06:49.368664 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:06:49 crc kubenswrapper[4819]: E0228 04:06:49.369876 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"
Feb 28 04:06:52 crc kubenswrapper[4819]: I0228 04:06:52.240457 4819 scope.go:117] "RemoveContainer" containerID="35b6f53f4f976b9ab4aeefee60f884944c9d8b2149aa679163d0550e580c4f86"
Feb 28 04:07:02 crc kubenswrapper[4819]: I0228 04:07:02.376228 4819 scope.go:117] "RemoveContainer" containerID="3b9ce8c9cbc4c2ee6628a27ebc58f92e72df8c945aaa2beb8a325b06e4a29d20"
Feb 28 04:07:02 crc kubenswrapper[4819]: E0228 04:07:02.377148 4819 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed
container=machine-config-daemon pod=machine-config-daemon-rw4hn_openshift-machine-config-operator(d6ad11c1-0eb7-4064-bb39-3ffb389efb90)\"" pod="openshift-machine-config-operator/machine-config-daemon-rw4hn" podUID="d6ad11c1-0eb7-4064-bb39-3ffb389efb90"